[Binary artifact — this file is a POSIX ustar tar archive whose contents are gzip-compressed and not recoverable as text. The archive listing, read from the ustar headers, is the only intact information:

- var/home/core/zuul-output/ (directory)
- var/home/core/zuul-output/logs/ (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)

To inspect the original contents, extract the archive and decompress the log, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.]
JˋU[ZD]"<ӕGc#bzy0)o!{UFdzN2eFC AxI^_.yzj!A"kd/ ޫ\ϓ:/߿ϯ/VrǽUY/~[X~;8yq9)|9ʸ<{q2&S|&a2hfm`(f`(KDx!$Q jkaI DK NFtU\i/B+E3 ZUttz7POAO\;c*EvC~Otђ*6sT:_B3 W$)n(K0B㭤a`h)ATh兯IJ}1?Tά;X7O>mNR-?N?Lj֖ѕ?\Kڽ9pjc~bXU2G׼ * -3%@Jr'HWIc?n1lrM|L`{dרU'U7EW]mznp&!BRd ]!Z`btE(9tut%,a+le2tpcWWRN\BtE K5|P 7 X\$DW'ddЂ:]JF:ARNhHi; ]!\ÒY" %)ҕft}pR /ݻ:W;2 ~)O=j5.Ck&!ײ9os,<e_4}3vl$sVS/p9#ϑsso.*˾^9Jk|56*wXFB{jh_gg4luom7nk?= N/$zӺߴB6 =&9Ay~nvglsF6;-!M:~kMꊇ1cg2oq&|턼k~|cVzuq9 ջiZxJzn/GT9;Lʙ<2γk_P"y݈UHI@ d?m?(}9jʕduq=<{ԗ(~l(Sy+.:Vmj gdٷ?Z""+gPpwPkS§%W) A_>t~h7K y o_uq__g{0E6 fn?{{9[WTƔؚޭP&;NyϋKm܇%{gmlTЛoW:x,f; ;<^-yk*]7G#zQ^?o,WnF-i~F=A}F'!$+ۄiB4q6 rSTƾV]tE7պb0s֏ X(t6 + 4 CO]xkܛ;Hj>}u~3y(sh(^K66MȲ0p eBm Ri!zSQ(/EVvm?xSJ^0y^dbyi}yu9{TEks Iusv %0Cw"Rq"E(^ 9|9mfljeV7{62L?m̝O(W .jG$YeI=`=98mZzNJ|dK4'q#^"zy}NaLziI(Uix@#}4LiV!wlGʷ u˶4q,^qe mI.-:H˾ddCslm+w u׷|50^l.x%^}yż~2R7"wߜ[` /3lY/0ߜ=^7 bz$$s)dZ ]!Z`tE(HWHW`)+"BW^]ʡEPI uJj#k&++و zPZ3 ҕ{58a2n/8\Hq).ig~pr;Si,$C &NJ O֍%*T0ӈa1re :kٹĘ-\ZY[9D{Yq('cg sX4x2.eV[4uShQˆ ˣGW$ChAN^K|5</DKNWɑ ] ֮;X(Ǐ Gpq]һpay!֞׎tOspz`Y2tEplZbt(P#]"]9)4L~Ec%_u\L#9&'3 zڌ .$?W4Xa.`l߫F+ֽ\"}i/uus]",֋KhAJ3\RRvD8#'DWM.$cZ=xPZ1 8Z%DW8%upq BW ޺GFHW'HWlJJ6Rr<]!JǽS+%9 ,V VX$epA%#*m3@YQ>Sdi40,QIhA ?;Fn,O3(7LgfvidaRd+%ulw/$˺EXcHزX~9<$ M:C\YbIrhprN-&v\"Sv\цW2Ǖ`͞rE{_=ժpDA=pE5=RWNOb~a42vL+CKsĴ6\hWN @n^j`y:@%K++c$;NI?qX#묮kmg {r>e~8 : Q4 ^ rA1ש4ϝ#~bJHqhpr ƿd*Iz7 MϯlT{=ֆT WǚZe^)ǃ++4\9q*Y9¨B+l4\9H,Lǎ+P)dJ%&g 6hAPi'\!$1L`r`̮@0XpjڪRvGU'R ;B Oݩ4LĕR<=`욛k.@9#\ܝQo${˒ Ue%oNȼO5-^Vv+Y-.GMUOJA-O*œ9Tɝ'g"=\`- WN+#@%Mz+˨T` \\I *;@! 
WWalNiԊvTe!\cMO@%Xq< Zz$\!g]`%-\\ñʩĎ+P)hs),G+'X*W P,rj T2pu  WD+ȄsĕwSG.bt?->wG=XzF'xr.?Hſ-Y JU,*4#5<©48R(!1pB+wʛZ*DJ'\#JD|q{rqE&Tkl^o-4A Di'2ӠVF?:Μ#$\cr`!)\\fuԚgҦܙsĕqLN06x`'4Ŏ+PdC+" K}K{WZ#{ZRiedK W2XS)DŽ+l \9,_Q~xs;<Ŭ@9\P͢S[;R Tu!Nʂ!uU*fߪrQ(G/F*tZ|3|*#pN??=}~Y媈9s=uni7y]:;9~:uoK)"Lwsd6Y6|t?fA>m`´~}"{-9xT?Ԋ{ƎG1j%0$Uh=PktÞSiHJ.?R?Lf\Ija%0yYՎ(RZjզbޜ%bloG /_G#zgy1ab6r|TE5k{3*u#"βOW좺.kRwK y1y(FGweu%|צnhU/XPp?WUzXSJ=6͍>PEl鲆 ,]a5WZ[6BB33q!(W_n@^V60^FV&9X6q8]AAO-v=-x^?Kt{xv-d+3V,^S( ݗ])e{ӧˋr.?V̽QO5]'3\}jJI-',whU^V'I5yܽY|٪9͓o?K}h^_ k}Zzw̹"U3[3,s8[ ,2="qTI{ƸVd:7i0ر2{(_dMM[6ƹW䙛[|vza+Y5]$Z~[+h|TSJA8ʷr~aETY7 EBVJ@|aAg:߽,!5]roTU 0bTU=*1mWl0Iʝé(Gh !]$;>kU9ɗ![ ّXWxO2b~Ydtm}p'|EM'Ót[>%ZbY7:Tk3fAIqCc,Ly;U#n)`4Xӽ(Z)ٟ̾ȡ:^v&mS&wq޸iGmG[sǦk0ϫڏq)KA( JQKT@/շR.C{GS'Nƹ Vӻq E[>sښ+ҽkq2r*3AUΧ+?TTsxZW\Ua_.Sw~6\^}?}r|YeAuA 6,oq&Ԉdِȶ-\rVO&CQ]ƋM싗8MV0(azrH׃Ws0Ærj_ЂfbX60Tf!#M )(;';CW5 5)|ZM&&odWwE3kpLof2^YQݱE",EFmȾi1,!VC}N';ͧ{S6mlRQv8!{iU6fh:7ol}@mޜQfΪskmU2*p\3Ŭ Zy .mWdVcs9Lt1761jld=- {9`zHp> .b^$U98̳o[Nbi e+M^,jK7k:\*0Ch%2̾ݯpt$SbAtSbc<~m>%ndbʺ|'Q6nקx,Vd<}*LN111/7Hv{فg52AhDz%--a>f0̕|= !šŅePKg[??̙̭ru*}|qP+vByws+Էzܦ!Q]m:.*{*'IRtu?g([J+5&?-`8SiJ*כYz#BzSi|Oi}nnzmvm^ka7~^\,{Q{ݰPIy ư5G7 i+wZZj`M6:-w4SfZ˷lyS/94su;ey t6R6:PO -w$a&ְR5<`' )<D\GـGHHuv/"O7c k07e,|7&2{JmfVQ_irߧ铧| \zERqQmGNgV,׿+)(i^ڨ5QҞ;6#)Itb]$`DƄ~6%ȵAQv?TmU6jk*-APbLSo9u>ypm/l+W&Sz܍|w30hv##7Gvc 9^k}SYVLCnzfA;1ǛP2qWT$_; _Mm4,4<ȼ2o6<J9JՕ*G`55kyDz#=3X§E vXBNN! 
6Ţib+:)-n"o>Sj9{3VsUP1AEZ24"2o:-o,lV0C"@1{pڋޞz^2U6jL ߖy5q:y;܈GOZ8ԣy9YTg{wF х흻aݫVNs ; .s7\ T9pkಘUaB{֐wN(MqBGA<`s3BIXm{)1UDS7j|*dRОV&70ߴͿ3vV }m$ss3&ygL1hਚ/|"jA>nTZwMkq)e3XiKi,^S)i5 #nym*ypx^3߂HI=0Z-=䪢1_=n@0%9jix2b]>e߰OgJ6BNvY3 q0 u5uI[flHXw{^<jݕ4KШDA_YRw./3|xzI n<ѾQhؼchy=I-ŌRo-ݑWM< ~1eX JI6.JTigA't ,u`s4UvL`B0~CA5Eԫn1')_Ox;7M̙If|5`-\fIM'i9*7fxg&&?g7Հ<po>L??~ W(+2 /=G.l>npB&r929Wp4WZYj@;Jɥ$^ ' 7Gg BFPdD$47ӻ)ỳ@q4M€+1˚tYC,Gg X?Řm( ax|iڬܯ-Uzj2PJ:\,mf S9;brƕ173˽ |kj|vlY5it"_HX^ Tq/}eUZ%z۵W@M5ds 5n{>9+:$.-1")J4Foz߁skM:Yb_,uG r7 X1)Zړ^ljv:*8Bg y9X|=Ho:fffaJA'_1)9LiaJcirlV4Nž)D0 Et NH1.*xDcMjo:b%(+5#Fu^]R?{F˗l^lO=kmL{)c.A ~n 3y #:~NlyI,=*`"wc^!{ ^Idq6}Mmr'\e'Q.k4NlG+,kKNC&}tER mtD)INDOwG3*;hmp- $]ÞS+EG;/>mx̥oR޹_qJSq|C+5/ըw1"DKaUeqq[ h ť&ѝfBڙ)`a[$D(J/3s=e/[4Hso*]clJT꡾D-KDwBN&w pwު-$+ փD8it8=RwvI0S^ iyo]JN;۲wjQQ~EW)qZNUěލ}݄Z -oZe;\b*H,FS*uxH  [U0%kӒ'ϫ,m,s"u}l"h 3(@@DUWeE\vSůG;Ge *5d*}]%xmi• .FYJZZSrF7!];&G|Vi G`;8L0òX.Ast48Q(jm8\5PcΌPriLӁ4hR-b:f{5}rAԙ1,`T ayΠF=Z ~z?-EE c >ZY(& wKkQJ' %UvPZG-@e~Km|*x4e /ԅPbiחc׋~sg*k / y Z`)Hm lEZ xM#Wރ>c.\hӑc4ˇΏWn Κ<cmjV%fYk[^_s%[?\L#}\:Ym8e{6]jQg4!JUIf"w=<`ۿls૶=OW?7COhu6&5y[3K ofZߺֳ{&fՂu|:~n&d 2 D!+sd:KV"XI`DqR!U)io+Wݤ*t -MPwYRH>FK%[cG|*Y?yno}P,FfUkL+b~ۀtҚW0z3 .EPFD<%IBB|:bUExHȽHZ. rFr:ւ"qüj)7D9i0\EIkkЄȂmʦ EcQ)[Ʌ9(;ߐSCƀD`Q59. 
@p>dOVQt8fS xko0JʜBRFAB!k3*b_Z8XwN}6kR18!Q% #_   XLE09p€D 8es #!} {̙!EFC@ڨ`܄q%FiTzV[ 0 d1q4߆%nɇ_sWtj]>= $x$#He jQęJK01V{>*hT5jAc{|٩rtO7//;r&XR0E,{ s\uٜ2quۿ qH*"c`, 5DjeaI<JS-G!0cfRԀc0(0Bt0geGLzE/Y9#CÍMCSP8GPBgA) .1 xUYa  O^^*(iQ}*w*p6JЗYa![@[+ 7JmVYʱo:SXyX ?+(R vFHqK#jVSMªyt|#XAV|s67c`M,<5Q,O%,%L`$z$Ј VDp,2ia!ӎo(9gA[n[\T \H6b!ɳoG )(˵8V#JGxTjLJ5v[#ނFR7m >T8Yib0f$\ Q0H,F\" fJz6=yVWEv6̤sHJ;j/uަf(n.xw'\c^877"^oy1 ŵGһN 7 )G\y7C[_̪n2 yL%I!w9ޢwqs*OG0,unuFVp <_LAYp6Hv/l3’=lE$Klq# !kj`:q=5;·kNvfpoT6lPy$h6)3 Ivk=d9w ,xl!T  h*~?ӨȥjJR2*PUi =ZRĞ?{dSIT縐&J$v[,TH9 ^G̸aM!lVsA0)AQ1&HqEaM)u>QqD7P#IΨ]+`2 46X4jO92-A̙Qj瑃[X|KPk^oGR)6\b2B~ "B'sٱ^tDOgqBT8 K}ENot!|.Zw7k) g a2qs$ƁĩdKmKu`c*W< "i3˅Ow RK u74 )Lãs:2zJ0 =2@P>̖hy vIoyZf:CTַ,U]RG[&Af֔dikb@h!(Q{f}fHu@ ֠#mNX9)1r9AtzIx$u  R1ޓ6rk+~}@$YA,ڲ%[vZT%bI*; ;<<V"=Sd]حT^JAD ڀ~*lCYJ>){VT^b4JF  *zhC"^qg, R*AJLV^qj*%ĩ4H"J!)!QY]Z"K,5jWI))9f$CؓڙSu~X)5" UR$;; :/ тjK%)OS4 [ň z q:P[uAX*eBqP+ܛ2jx1Ιv\Rsv7V~:.Aa04':MbV1RJEמ.\AY*i!Ҽġ {+,a;fu`8[:H: #T NÆK)$h^Hʑ*$*̍9fKgD:BAi}n=|ezl#4p0@" qK+ ʆJ a{v7UWҳ6o ]Ҋ,L{ V؋4SZ۬\c͒?1 ;C;os P &xXI'CTPp rya'MosmL>zHއ!@vж@TRU*5q'aLYh$JI@XBiYJ*P]n\2$W1 9 |viМӐQ?RB5nqBY)w|!xUJ'<|?3q)v3Z,g_KLP`75aYeu-ZLGK _K)V]gÑjAC:G_^sp0IbGnde>f?'odjnvڎ=h3E@ch HHHa]/.Feo>D˻rL^),p{zv+"KH>|zv* |z_;rOb|[ RVJ+ޟR&v$pm:ۗ*Zvc:Z5XT(2{Xj="D@ߐ ʀwV.ߵ# C7pB7v9u_Xl wM79A{ A?WmxC?J0 Qur0Vu/_ޛТm|F_S@B"#bOS[,1|eſ.M1)W ,!jhDQ￿.^?ZN^n?`(w( xt҄)g}!bb*O&(?Wϥ-0㑿)jM @[ʎXW& LQ\3eFuœ-JS32wf6Ljh2} %ylͧ`^Fv_mDGRc=}ta\kW/<7R90c _ !7O* 6ɹuϋKAsh~tyڦLg)VOxZԫZkoa{Q{[7Ӡo&O];2yLIVgzSeh2byg7[C5 1~EsPqҝ| ?N$ BPp,f)9W/=rU YX 3&BSpo?uVD^CwJ( 泱cIHҽCŵƅ {LÃW21)_I/౷x$Dv /;(cRߍ侄[`bvtY9`ZP |D_<.#0SVPɬ34 LFI~ i[<qmvvߗUYFm5Liok0JTLqѱ4/&qY!CҚ[w/ 0ʫ$`ȝkxGNqxܻt7O(Yv`LIppk6. ~$sR1=@-2͐O: LD6KraP_q,2* B gyvAf#=dj ||6-j'A_9n`(^NR=fȁݻY5] P! dYOTyzf),e1N'A؛Q z8U}/z3e^yǯE\|/FBrY,f?w4'ٕHQd &jMs@6liXismS^n&e&' cB½GUo+_+DfJQ@1 ]e?e) (@2fZZfhX f VwRLJ%a (V hIFA'C=h&VHe- O0 P]+pm2؃`5f (tH3!6u_iX\ZTZk(8)Oea=bQDRKA-u`! ! 
EcFR(.*Vq˗YMo~Qlc\P ilէzbZ%'4^Ԁrpym͢{n}lP3 ?PKH g Ig-,B9r Kp jа!ǷJ+4٤F}Jqz:]O8P3+Tį uarcn2u. t JTڌ_U,TorO/RR;l; 4,2Pܘ%)|=SF09u�{ҡ/8L"-) wv#UebgKPw>!iRՠ Aדsw= AiWm67. J.!i\߉9kfIp%Ip/.N0MA[Z~r{d#vePJD?aT|&}7)3~L9 )՜鰳;Y͕ 8)'F=Tfb7Rs6I-Q4GuJ^MSj!Lu5sxJG:X㇊Z0W $6"EPf3[jn5_XvL*U_Uc$[:".|tUhԴBMN-ug2zǓ@hMa 0+@@+( jL0ł=E ȫ/f52 %y{|غVC,T=^^gm0lNmGnQ;P*QYMD.saV%'뱧M]IH+~wdjޓ9޺W[|WuPk+|-ZhAZNĵGsyMs͙1B3)RbB{댷g4n=rטaoF)zBTz,٨ ~svh<'uBЫ/C J#1N=v)o*Q/r~]h~*o|w4b|:ʗ$Q\a3UEI# ^c# k 7lw8EGyJO]5Y;HBsxL>Na *a0Qw,8Wh40^D7jǡ Tj _vNu~j +u\8;5[3j {ns21TAuCT=gį"yQ=$>(&}J#t i| 2kcsjՊi&Ӈ٢[}Sb?㜕=>|~-jcJJ5%KW絷y{Q|P̓ v񀨌ܽ: K(}sX%@*"Sjf;a$\_:G? Ԇ$s!B{ <PQ W4/?2w5xJEDqzybQx0Oe_!甚{^DA8tNCc5iCޟ"MF+)IQ|J$]4WKtQ9?N4 ާ>=v8N)X,R>YeILRvzJWwE?W25Ʌ,UCmeipwq=X'yG,` ֽ(7'`U&NHCٰ}VGѴ; S#&֚dJi4屋 $ JJ\MRk !֎p"ax$r".|\T2ORD_O.̅0i 휰@%vd8)0t'u$6%BL0ڇ9j/2uI k\ d1AYg*J&>^FF k(j! W7׽ȹ_.,t$[/(?bSZ8~Kz\ȑz|P Ix؋|9=ҮRu^ǿ_ϝUˑsٻp^F3ʓr} j1zˌQF.&*XDM Td6.Su V@OVNgm|1 vc~ \sBKF  ʁIy8;k~ %&p>w"gR؆>~>+5TD¤LmRkI/Qju|zǎ1#ǜńiOSJ4S!@Ų"r,1ZGܘ¥тjH 4|s;igy ʶ϶GCs f;VHEYPxh,{Nc4t=u9m W T)QFf'Ҕs.dJE ޗ06՛O k>ݔ=5x/łpAt~h QjwlY.bICIC决W8Իz&YFBTĮA=ӎeanG$gձ/'P4 E/Ѹ"yHg=L\(R EQbh E!#Yyli\ǒ(6# 6+׉-[Ww"oTQZVQ@$ќ'.kPT#(@{OmXPs* fJ>/xvU' V8x@V(VeƇ#rsY2auy4*-VxmesvNqaæjT"4D {"#I G?'p)ʨ`\[`^جӘwUܠS+8]٤: \=YW9,:{*1Ce,Yħ-JkPI[N$J`ey bk EZG kqj8MCRvV$=቙jv`JF؃}I+5Աd,Kw^GcMnb!A !;>> `TO V{tobm81 soq{|-ɍa0Gfu}6QA88$Z4u,|ҾmTX]\* F0]yҬ|+Vfihl(H漕ğuO^g͢鍥4R}$;v ,J%Y\R7>-;]0k2ӥZ*4baǖ;pyivMMN>v4Fg ǥίs_0t۰f5JUL|&9m1fRTF[&OS{Z}rë+(nRڴwY޺=`"э(V.d'Ia}?[T:5f/+O(nf)-lw-zJ Y&Ԗhv<GfiK7>zws/#bF,0MaHLh+pNd=&Yl:+QB1S\)ljWqKM /Z8+jB0TOeNV'_I>fu4 ot9f*ui-~ٔլqum>ȳ72$7䏲b⁒ XYgKP&opzVD';kEPlb$5kUDlُQ;%y/ OѱChBB|wԈ~8C~aT )޻3Zafg̠ۇkE7p\È &ĭ*\av"!}cK4.͢2Ezs6]P0ЦU4]h+! 
"Eb!(`D*K y@?{O~s4UT}ٚ6>/h 'aA;xj9W ߒtb}Ҩ`c}q(~c(# /B@wо3՜tJF_PstzepjLTo]ptڴ6mYjtެFn@y rQn>|cU]sTa" 0H_/2[tqPlΟώ͞@nSRcvV:B 6:>ԲKND(!5Mp w%9Vr׸BAo fv/pXjX̗eV($j->dw"rvh>uA9]PH݉3t +ƕ4F5t$ * 3=VxɎ'XwP*%j}Uz|C!zP9&7LDKH~;05f]UvU)~J#>޵a;n)尳b>w4iT/- 8]EhdF%Yqm8qڸ$DlB4 hpL}ן s>dq=`2>\^_ R]ǟ.;e-+de4K+׈]]B^iXYٿF*]j\z1-KFp0Hx ӏZ.pQFXxлgq(+ Sa?Uy2ɖI>O}$dk e nK;Ync(A F}V9W'wmAĐ/0;j(kꢏ LIF;H>y=.6j5XE4mN<$=ęQop<9,{}ϲKg|RQT۽|ݻ5AHn.|*/8-dJίKŁͅt ~H0 ,Pxq`+f^2 GΆ,OS3$xzy%ܓJVOsoߜ61 -67gwKzOϫ@NǛz\.iEe2QNjy|FsäM~ c8`6z#'ⴄPd) \x3vLIJ$]ũad*+H|:'A %@ BT,)(g(8Ԟs٬W]:vN@i~)ڵyqG)+/#_Y0PXVвHyG- 8L Pa e<7F # &&mS]IIwANKc$11z(UۺW͒.*sv '6]xB*=mR6Vmô˳_0Dzug٘g@f_VI廘8QNe{L, a2s*BgPz}+05ނL' Z+sT2OR\ B$6l|^d  PYBaW0ƭp#x홾3ow_̻Wf,j.e\`A0@cF?r%’+TCrܒ1HuiDJXy|=C74/I [>} UEs\S!9;$.\ guTZly<ň ;(&+O bv x*TwE$+.pG]ćBƖL3؇kDa/O9r+%R8R;, r,$-e[/tsFBnobReF3i2 [E&K\6^ ,72 Q]$8<E..(cj]}GL\U{?5cK! wy\5U\]bVGQfWa`OyYL3f%Qdsā2U ?h#q8aT*16eSy+\Cf 9!=Ce(Wh\W|h8F\)'2%;0^%L.4y(pe 0aŠsΘQkc}ގRSE@:)Ys@52zԣdT)#UPF FvrJn1E2iH{,a[6xjdl9'* 9<%D`EӭlH;F0iB@;uMc\qk a*ނ]t%( Iluʾ>KQYtks.ﮗW]V k.\j.u1r l W\9c\3A '\uٰxV vٮ uזL\Gc*2"-q@W0]*_[vK|}n<3Kn@)q׆69`9/b!KaTFfMy`hTdQP o)bjn=l~/~IC3^W 1A⠰`7HBI$8DkdPa/  k~lj4w뇇r@=iHQ0@V !•2Rm廔K{Bqt%&yIi]"#MF0[;J*l:MF5ޞ,܀5#p)>1U?d 8+C忦E㷟WyZ^xOįm+q+aHW)4%pbё k$ Nj%!7;yB:sry{5ųjZ͚a$+,1X'3/ [Bk>#/ݝJ+)'yVK@ )BGMޖ2mOߤ W`:ئ(Em] %Vx4?^~U'x0Ԛ_ơʤegMnw(кF"`4 /Xd구iY{U~Aܼjd\aRiSX%P0gBCG ղsBJHl-?+Xg?0x<OTɋV;oݱN.09ا;R۴*<2y8_ R`YrpLu7qj ~Z6 Y?rKȇ]# f7pДxj_2dBKm=S2o蓉6#V#$3!z˫5?g~eR?z ӆOf6SN>$ iR&5*&{UkS{Hnڃ< 3)4ELМC]zIl")$YDOCF809g$6XO\ۖf)<$t>ebJZk^5VSX}Çђbzj2Mj!뢄oV @J6!:Iu6N}f07b*Ŕ$E_p8o# GȎF7):@JĹHc\!ZUjh.bUZb˟t{?^nO_ j{5ErE>"58gͧ6TTgXX,if:m|:8!^nKT1s%|U\4AY]UZ/`(|ga%|[kbs_ԟYle5_KYIߺ&X=m QjB)1J _`+.Er<v2;* "Y eHN%gwQ'qQ|q>֑y">g QIhe,yՋ {!#w>^F///G :ąQBj3ӕAXXO 9ܜH4rs]"mG#p(\uyN7-a{ΗG<sr|3$8g}X$퉞~]W Ҩ.%HtkBjʍ_R09 s-P?]Hi\tV cN;$qDs=@Aw9%RKhbd y) %h5ŃWr.Jd`Υ)@5MY*i:4 s@Eh~,sup,N(X1Wp n+:SHU'U'v9JyWs-ЍAi.peI$aƃ¦B52j{,B8:<)z)=Wjk^,no.܅ː3FߎICURSrؤ(M[ #cz^ȄJS#qKՓgnNzii>3MB Хחwk&~M5 gYuTf<G06"EJRGTP'K["^]&7${b^IV` lz 
u[rg߯]\/޺J=_U`YUr2G+fܠW" 0NIȃT!6yҎm}!/CZ2tFP'`yW \aqل: + o4β!xd,K5wn|BZL&X#Q|:Q6(-+D^"ƔgN(-y|)cjz2[|?h/r`n lETr{ 3HU:S9~Q3&5PoR8ߘCNR,率DU@nK72|LDāVv٭#Jd wzt4Q@F0R×8qr'䁇65`]ijq/ir,wʼ w)K}yAo}r"HW!Gc~:أ0W %LFiS>izf$$Z?2=ǷqV]l- &T3j AË.=|#iВڡjh )L|Auj.}./_覚Dy%ȉI_^}!ERO?Y?~8xw5N3%X' $DnepX{,e} ۦlxCQ!C Xxk2؊)R( 6b[5iB`9x0c6LȂ'@opIj[ڿ-O̶?"SMjnS1ZB~AFr @: +[hS0 *aZ1sUc.0{&pĕ %)[Ijc+j{mWY׼tw\|Z1(;A;w&RL c U;reKX*`(Fhx#5A#HS;c$Cǭǔȗ@l=6 CcoeJN@+^tQ >+D +1 ޝYK?'^,08𲫢+61/>;-u2؛ݏ/L|9ȕ$rh%9d *!Sd K̝/J! wV@pp$B4.;׻|[#LDY֖л ^^.2X%hw%5[L\@T^ىkM"W܍[ ϯN*IEcViT`>@rqvd)g I'AWZ掠m!-yr T|~ڰoYsȈ懆!y(0 U !4475gB,x,I/V8f^ɶ-V-osP$^e.ủR_7/9_dqo??-"`M(JqR8"XzJ [ B:2s^8AI芀A3a'v{ԆwI~ݥE'Hp뺜:y=wMGI oҿZ1F]3r3R[B1OHA +,Fb"X;nʂύ6 nV:F!284 Uly5VhĭPXͷ>+#5[?]cG=l<ⳍH3grA*aw,ԣR>m NV*LĬ0X&. jSIi lDRI9JK>Y( uFG2ܓ|GD>Ȧ7_k}:] /)]%I8D#Th{PJ}Emdm$I#g3Q`BGx f3ڈ;gYtS0 ˆ>ژ-9=IvNWS銲~hwUVBHWaIII"we_d8 ')䩡1臢*<1I̭"$~b"̞+4Յ`g;(8㚍7U FE;2?4 DVCccrḁK,&WCcE9zkh iEi's #EkRͲ0'R2p| K86Ʒh%.Rnɴ?b}!*g*.Φt-$ ߕ`My UZC.ɎXK6PιQ!7Ґ\ b4^:Ehl =~`:ېBнJJrI,l V/%xlvlIxk -ZZ%UR y"p0c P-82rQW b`s'-XSM$&w˗ ȴstQ}h~t6+UHBɱՅm9͍ 2/3]Wα 'eR?N!D|;vґ}u5JCi9I"5z&&;qk!)ex!"*LlH54m2ToŅ96[4;m4&wPy+v)/£~Q~5SZsDud._i w$Ѩ/ow1y~aZreӢ烕P_u4έ]M㔤 ^S4iᾦ;I53|@NwE>6?֤1'a.h.q)Kpyۙ&IbX[xX ) 8qGQCUzD4*/XA⤑MoK:MDy4ш$K>'n_Z/9#rdɒF *raȖ K7ϳT?? G6$GVU( RJdGOo?d.1!24?>|yzE2X hタY(ɩq+_ZwMV))xNyV< ZIVY:/ nb/*z~\ r9>6g*:bo+rg<]RreAsQLU`y1akhŵ'G+ScVsT?}UBMP)9[okISR,O&˱}Ry-3/K!Ƿ43f(M^^dE6Wl7j>R}S["ˣL)Gk]e#Pt.ብToKh$x]/ƺHuEYͷ 8Eaqa sF`czV{8$o[M }o>4\kSC{B4d]N'L-ƯO& gU86"T͗׌ou2}+MȤh !eFB?XhnhN1vpyE_r_b$&~:?f|bv9} Y~oo@. 0_ϝ<;ųIA\o͢=|yYa>9Ɖ/'=r.$0(ab6oj: j{=ٸe=\rt>^Zi)(p$y'#~quFX,(R!N}9+Uܛj!o2 9kNW~jU[Rb$'M8MsO?%, _QƆē%y^H)y v\=,T~%J_?xJ]Mzq2%Ok7!(!] 
m7swPK-_*RpE ny^/Fp#8#@>?8US7Ed|΂u'cnlSNT3O{o\\zv2{]5Ǔ#`YD3c?^-5snox奈' H`M-cLx ؄,:K0$fTPO3l,I#"r3ŝ^ts⢕]qyC'8IǍ:bBM\FgWƪDM'jֽyeubf^RƛU,$&b7y> 8XK?͟ppR;Yw[?= xwqqdlw?Ջ{P8P62;oːvŁsCʃm.R TF DX9Z)r:B6!iXy6^ec:*c;r@XÀ6֙歩U{Y@k]u58 kؼy a\|C6 'ZRHa@(XvnXźW&Jn5M]CK"B} _y80;}O;^Blp2->+,Uqlnx)"^E>0;ܢGM};6V107)V ^Cҥ8Ka!6EYw`9); ;7o0߾$q(ǧ Fi@ שOKsFZֹ6bp#k*sR ™ 41j_Ki֏2?|:_RA6;l}+xFb^P)SQy}rMT ,: LkLĚyk) !)I}{?N5b["U諛jzrڦ m.Pl1CCyc/0mD@i@NEhHB Mb؋q򆔬#sVtT7pFpg.w_ONm^kɇDNc?2۾rJnB_ƨ(*am-SdqbF%@)KqSK_ߒ ̒|_zTՀ$jR)q[^lJPzuf& $UUs:蘴#%2!R ZR e-V#A1Y t; 1@< _gw!7=2+?x hXLEDP5-0p1C%ߐ<|T+E&e1MP Xn ȣ\ 'ۼSQ#ri~yaA!4O-9mT+ZCi`̈́2;,+$\U&&R:᠌!.8Y^_lVt8 ę$5 Qu !IJ#.[RÖuRN9pp`*V|!թg* ;\ۋR~c ډKw% 3OPۖ6Gk3 vBt &h$Ćr[q8ER%qX_$6sBb'o-t5'ϳFg} Ճu=+>\g{?.^zBd_x:;{$9q?<'YktFuې>_ Wt Z?_2vxwHU/nCmɞ<_X/@dZg.wFmLl@V,ZAWnn+ެ:uWe+H$Э"Hs8)|[yK4qԚ{xuMYC6Ty w8 $@^ D4qhܵQh󛏍FFmNzxax4^JXZZbIi&N*g- s۴\$^% {J4H_UGoiv2l+Xml mQB J,6\7f$ jےvEM4KJe2"->N[F8V4[e_*^)vs҉4j[ U[_ ; isKQJ 7$iCo U#$vSB]n.sS*:&]c޶yZmTv*(Q;  {m=yBfT;X+LӃXiv\MKԉ={5Ab D꒕JKٲC8~RBU}Vk6SV^fpP+3b8fĀU0v; ^~a̰u-@!V#&RJ82? -WDϧF57էT5HVEMbJvvo159ͩ:ý^i^Hw~ TuGRS 腅K4xc˰ -04 :5J$ yIBͽt=x9(+m'?IIBqoA&b;4(X^qycO*Ln"v|A](XN)^Chvq>Ae{Fӊ/\j__ЪhB7w'/q9*8 Jo 5'NOAqT?{J,i ?}yPjrbV7VQ6OF}%o`\) -b3^yJ~L_ּ-0l03y"Wf-^ue0\qlFx6f[:yeM֌d N۔A\sE%aIAۅySjRk7+4j06^NjM+qJ3.uj}>H;?%6~ch9F@A5j?Zʅ[W/iXcÃVwmm$;y\*Yp0389}:A]Ƿe_lKmڑԒKLluU$GI=ƞb`l lU[*j?>d}Hx OlnIA>iK Ro@2m 钊Wziݐn2-O29ᯞ(2׫7R&qjby5*kmń&0ڠMDI;A`nċ irӮ-nD쿒zmF$y=;cPx᭿u#``j檽1R+"vK5"fЛ^F l!?mZ$p`XsP<@n3cIޭA)58Fx*<8xɎv")<#Fә "G[QyVD0E:Hǰ3@ {Ke&E,H##!mBdҀt2HB?a's'TQd1 VOk}0eG'̰^ф?f`6"1MfpFuh fx@)lKЛq'Z &hl$[Kד_jm[pOM^zッ&oDŽ/ٵ A"El0!f W.x1 !jC`=U"ҐۺdI+}F\}5e*|*(eiw+R1=z?A*֩7 \;cyhhޗ4~q'#+z"dG13mF&0"ٞ9 L?UP5`U/ކRW)C1U`X2!' 
D'ҥr8:xD'|<)u#غӒd^\/NbQ!5}.3xX?{fnwyԴF{.lC$LQ)NhpʧL=XcBh]>&+-D dO-4UQh@GH@gkrPnrHmstc.m>gGS(Sj4LӤ@v}͖\0Jח,h>fgk0b]4 ԩMӭ'(ާVbhY@LF،j E#8|$k#}㚤a mx`4ͥŪ(vkww&W%$8=,u.]\R^ĽDDYɧt>fMce{ʷ'T[@66$5NחRj>KW>1/^Xe;ucV#eysq}2A޿W(NSN[&q8}s>Yi:,.1^QI盻[coZ',6y,j>.2c<-#hnռ,DXU3 srg+.(#sP;e`iUYq?e .&Pޚe&C6u(XT>b+q:DT%)gmcOmA/GwUH?o B`M27"ퟌw6$hBݔq-._^71FC&Y>ϟ2~0Njd@'^H96b_q-xv{Ar;3BGYo /Yuɩd'qz늮*AR cж^86ʢ1\ .`&PE_@U6>`Io]\ik u0MV5Cf3Ϭko!K8cٽ}<}z>;h#^~.ҁx6I"J{`؈9)::wnt2bG_NNgxֳ?nPwx!Y ֙UiZ֖zA+={]Tž=ºgŏlOV 7H*+V #׌C-*]61S+tُ.tZsdo?@+U=Xja"f6*k_BKƥ,M= t\e0+"xD&^: f(Vb0RbF9[]`}Ÿ11A21·@lds5:) sI+#Fb\NsTbzB0@Aw[SԌ+XT#,iLoO`h~M}\8ihK7Sr!B47W}SNO/Ջq2jc@kX#E@)./pl˱1:Y&"}P meϨ2QJkPE1틠"hd@ IEߢi!x}xLVIL(׆X F4yFY<9d%Ǫ˃wfVx0썺XUj7m,C qB]e*ij k2! x8B@Q\u #~5l&.M7L О kkL{^VBEr~<0Yд΂{o[p^U7~b4*=-{Ɉ[|9y!*k=1%ZɈ5֔%t$0h߷(ȨPK)D(@dn7}O|v\1gBF!a IدM\UۆfzpXH`X{hT 悵:F$u1t;bZ*eˏQmv GjwrG\.qsqR8ڀS%-je_S4+`*h>9R\j`_@|cmf&Tn=vM{%/[6_ONۂLLx'*k! 8\@Ms ;ߕ6ys<*r>6ڮ4qe \׉lJd/Ў&0TRPrM9Z)eZؗ6fٜu'g62 9c۞m[9A ? C=x|[d};uZ|H֣.&|`؃'!0Oo T콒}'-Qu7k3 @: A{5dtKݚAhA+( PK(kiRf8h =#٦O愶Jm)L]*_B:/cXp}~|BN@6uݑR]7@ǮkĮtZ{me^<:| 0ߑ~FZ"CVndJ (s"S!-Zvo<_܀n[Rjc3M%1*ڜD"0:@!bg`0!&?gmgVZ=>:.}f/7cQ/-?[4cYί̜۟%\e5M*a+L=9%)r< Q?5lMPXZ3\o*J?PbGm~GESv6b:gyᡪ'z#ǼP 31ܡ#L0?/9` .|sI/üiR4yJyb#uW7Yfvr]dv}~h?Yc6[^Uc!PC?ctFѶb.IHC2Q%߯{0{mSaF;+{twє"9|-DtV1xC AF}"i_!kYL\t?a'-;1ѓNlĜ sbό!S,qkx CSޜ]Bh2'7pZ?8+ß=>Nk5cȵfnݯAͿYnUMݞgk#;cLLOځǝfI-}ӫF~ѨtiZCeV=kf&QSɲod+ٍ#I":/ afs(Tל _%J1̅ddRuPQLWdϜ] X" RsNrvxH\:,lw~p q h]\ h x@=AJ+kJ14;l-*]U1!z)6u-5I4& yhv-η'$pDbÅ7A3:T6q!$+2F2 d9!PsːjYJB,A@6w/ov_\Yw~ziϗOO?k^9˯+^w/ 8͟_,/zcͧW?X']ΤmԆ$B/k[ ̯g<l.D7/}oƩkϾxZ7//s~O$+FՊ'֩Oňzɸ2@=ex9ug:7R ).F]PȂ/_$%|i8`h0$nOD'5WAQҲu l)#`UKهT >]',?YrB3dUJiHY רԧP;|5L"/|A oK!>ίؿyC ~7,9H4`Wgؓ4Hf-Z+fw6KHaFerPQ#4ub/WorI.-XcCR8=zD A|ARBc\92 82?S.5IlTUnTvTuHR:!]16 JilA*:TC)]bOqT#;xNulǓ ]zx+" /} hBz^I;DBV"RQuCNQԼV)֋QéAr^'hsC.' 
0PȡISΫ;Z||wQ}7:?+'AIWz=2d\v!ZS4:{fK@ deoJr8V5hRvH& ۡͩ8roCdLJW=VIQ_HOYr` xT}́I917^dzRJSF:_l0q6 i8:u͏[Aek6U@١n~utz[-H޴mKBKVp42Æ︵'Jܯ'dZ EbXt?: Ɍ],$~~7R!̷е* +gG}#`}@=k~QZ$sKc V5i1Z%)&k(ErU" y4<'L:p1%(- >JK') LM:<Mds iZаS>/R 3Bk M,'Kن$ݰ:1yHkLE$]5M(˸,_tv;ѭo3Z\0#~Ʊ/zQ3=3y 3uaFϾ/hgܹ?;#{Ovv L#,4| uzi& $ )T:$"qؑ Pv:5jɢ^ˢ.gX]>::PKOoZ}@ߥehWS`E6*l먌UN'R c΁U&>bll (4ڲ} T(|6JmyCVf|OQ62?뫞cWyuӻ[wɱ'z>Jzk]pf ȌMu?{,G %>h`hXŴ9j5Qv\h8svg_H\gk0{1R p>J*4#6anbN6|*%C#! 5Ԙ,*-AmdzevhR6 m[X1~hG=MXpkF;Cۮɸr'xGLfމ(XLnOF *j *K2 W˚I+~>q9h8^}rRU;'J$mppY(gYO%(쳖ۨ66^+ScnI6ꂒӗNEG e$;wj7L/[;HJmpejTtǮki^h2@@L^l:(@1sV69皑M}gm4qL0un>}}wq8le;>yWQ#xr**+&S"'ϥ`@XM{;i|0n><;m~݅$FQ[2XuC6!L 0 7!"n|GtgcJrKj4ᨳ@s[vꨙTS@Z-q.%[5Mƴ1h ]떯v_Xxu0?}aەaۻp]u_1}?̹?7Ȩ3db\2>H0 :ѵz 5f,JcJUvc@r6Iv dZ21)+B d;ʧo>~X炍Ƨ, rp ĴE[8{K+V̋>yM:0 v筊G{?~m7WlHO& W+HzZgxk"ba[A&#)ʵ"-D.x f@o2Kuvp`⼙#9]޼3Ô`GM`7K Պc7auFNXݼ&J~(ep`o"}C'!:_ddHN :e肈ODO ϞS6}VUDɧTX9A#`1U9;vv~!Rfb]5W7VӯVí@_/V\Hɷj<.V1 yO_ kQO;}x2(cCt6Rt8_`_h`/fE9,OȾ_K7Fe莔NK7hC"䍉O&([ԍVŽ|Eĕ'  { Ŝg5 ^^l]K99'27k>oH#I(;eNrdF슖sBSs`z fٱ9cuVxr]cw$ٱa*;d!C5rZr_`PWjc9jœ)[y\ԯbĉ5bF:\qEf=MK }ZY1@&@0slV%7\ҫnoۗ/N>}=z3xFXӃvPt?d[:>%Er:iĤ.wޕq$Bew yyؗyYCSxI$u lRdfuSX%ݥʈȸ3O9Us⭘\s:$e"Q}b@1d:ބbކo>?:?xqQǑk0a Xj6^h^\ȓ[u V$!>=9XAuًg+;(-7Lq h?ϖyI56jydY1H@O07̏R@IŘ2Șĸ5HsXH\42h ~#IusII?ʊvU쨁n#'06 uһnvc&#:oב ł[K;=cFwgۑC+y4 Պ3CɊDхEYRlHs'[|b :E6a|JvO9ݯ޿_H&,C98xWߏEb \6@stI$[[E@1 vF$"hcٖ11z]VA1C*RѤ mXTjr~@!7F$kQfZ?<g_ʘMd wUDEKO ܦy_ye_/a ţtan"P]qxG0·77)5A>>{!NKq|YtjC9;+&`ǝX_:qa<2Iъ QU1֠ yb\Ě̃]c52'$Hqik:7g DajڃAV, 3"͝|s3X\a48d 8{mdc jkQYkq AM@" ozs!z7憘Y[[bȜPfZ]r|aunsoM"~wQLLMwUJG~k׎`s͊MoRvl2FQ^"᝞\ Cw@0Fl泔G-}zw]b4-\oZiax2}C^Ll჏y)]`=\L#B]&Ie)tATH*c!rbdcc0 l]1^;G: lRF)*vJ#v_2~g!T+@:Ĩ6d`Ŝ=[/ 6$SiHa*qyJX'+6pBVω9{!Dn9;WU'$h[-x-jH洜6euT^_[7wM0aO\Dg~wY:?|?z:-"vNa)At-B=a?iOI/bls\rT%`VɊ(JL&%&D[9*:Z_E|Cb\oZM`Vad OB#g\-ۮs1װ7_/<F>;9זpQe5 \=jR^3єp<[e{cW?4}Zdz-fbj6o Htͯ%8Z)IE 2mn9Hn М61=Q^uJy]8Q %j^^tom`}'̅k=8\RUplU2d"UUGMڷǜsÈ[!.Ӄˡw>GⴔB(%Jׂ-Q/\(((aB0$Il+Y'%^!M0b # LSZuG_¿`\XlK0 /.A.r5T{ zY5_١'Wv㩬4_tR ,Z 
r}j׺]Z_@&vW4R8^6s#ZJ5rgM/wr"{jmȾTJ"p﹋SGZQWO3֒'?,9Ⱦ|땟kLC\K^P UP\) -Z\K^Xho]hi-M¢H>cnB;PLxocҦBY'vw0&[Yۉr ܻmQxU"nZ&[zwx uz0Tx=]:Q:)s9~~9cn}r̦)F1h3~f1G0k69*xdz+stjj`PFװ3fbUw2sC[p&I<0G8@JӔ9X;]PC&p\Y\y? Zzкw1BBwp:'x S4+Hн1s`aw@ O.VN+$JHHgWA-luʊ"DQy41>1?OvXpp||0!e߳s;/Ҍq8w[O\>]evx0uk-#͌n:Ԉ y9ϸJioznSMf? B=o1Wy] A `|( Ęon']9~Z|K6Ek}|l5y-F11W =teW=ZMqNtl^s嘉C::_@*u9$ua溕c+atf}9rC^rg =om_|~+l br,[`6#m݀Xq#ۮ'c{&-C*{CT|4h@eΰ{<=RK= 24*|EU[Ԯ![u}d%OqT@ ߰|N0EuvDVa!䩚 Ig!xeoj ٢n&-x/quxrH.iQjx25DCϏ6l 5y9ZaF?XhtOsdRb<\y* 4/XV,=sJ-G|bV4*so@tQ𺌽seP5< JѹKZU1eV %B'TN | ?{ r/jC_/brz_% X&:k'ݙ.Prlٲcْg‡$, y.5DDSjm 8(TL_ XJ/NET09 a<3: !B=rmfEifnlbkuڇPj}TA[)2gO洿X8P*9^S%/2-I'tr $3Oj.q)#DK+ TgY3\!u4A,'ԫmNm{fb," Tjd^pEA:C/ 0[q`PWh`wV)JzѬv肁N JJELP\L~C0hIu֥|ɍ$vw@4_XJԋujnb$6W4SgBpCFD y 6O_iHțhREʘ E3Q4{E2zx=!9KSd4LA!RJi4($Ǚ򯗛mQʁ)m>Qj1h1|rf[ hҮe {i:Xdn{0]2I,pA3+Y^t oX]G&??;3qv9-HafDYGy=pk\<=xf#Z|P} lPD3mefphmo{+4{d7n  Z 둸cs 4:%XO{j"4NjqŲܟQR6[)\ԨrЅq_]6#)SteIn C %Wxsjj]3B8q+ D*2#9g^{9sh D&ks]\.aoTCM/XFW$Sǵa\$%HTrIBX, 4$%5)+[clbZZ̙*`?=QfW2II%-NxJ1J|^ϧ&=WIUls\՛L7,R1`s'[Xn,9~Ye;n%N~Pq^qm̧a/*'<#UMB ~, ڛJ t0SU̗W1_⪞/Q!ib{+'.'hyAHg'x[]_ɞIҥ_=@bL#t9=OijpRzh矓|fȇSnT dT1C*7@joҨde%+kx=RUVD-(b;VW<g&5k%7 ؍?ɜrpTW`6YORk[lM@_byp>m_ kq 7h9@#|*A2#tFcET8kt'e7v]|QCWo>ʕ8_Y  ?Sm]Gf|w{OOޗSClt'tG gϩ xڈ8¶ZcO4h4xL_dK>k34~n 0x1?Y­8>?sbZ TwQE"N&7nIG IܬߏqåX9j}׆d-q$ڿ^qx` XIWS2aK_kr m@8y [rL LzF=hCZʀ*+C([@e ARLY-xTJ=3E=R<4v2f$'q~{q%jMh+)rEγˈA"zoh ڜc9Wjm V8FSaA Qhm!ګjN]VDEى4 9ɐ9-(<'pbpVșʡq|WBE𺓜JzjeV{jeVV冀̋)LEsRd1onXJ̼2ō ,j VQ#bè UE2jh^jîϺ0Y*?yLِoL)|&L]`3U-y( =EuLu N$ y"}G 6.@eR n mdf cD\ҽTМԭx Ӆ*ޔR4mRɗ+ ‘pיMt+=h A ԣ4PERbsS0.n)-) si9TJF૯7*a' $x|1\S*(/P=|Wǡ:~LH?Η{W]excAw9<>ߣȎ{f ݒ1JZ |LR+K&g^j?r9yr,fQY5B8vط?vnmn׾vۈep2*kJj@HP6d"tU&'ic)$^9qρ(4aZ)cQgL(nn 02-r=/ Z9> SZۀnǴ+0;Q0AEm4#. 
*AEH3Sh0$di5 A0J͊6P,T˙UzOf8\Ym{i0 0vVe,Gq.3mmOErn=K<μ`k)sSa8AiEp I \U(- Kh\Rh i~;/la.z&4GFB7TxN7t2|xkۙHrJM}PExHp"`ΨCLD co,WdW3gO^7擷y[Vmaa֘i5=`GnĸQ%2o+4*WtPņ4uq*h2Bt(-I9M䎆Drd %chxf_ɋiG$3g TM8s|fNb)FYW?Fˆ0kq[058㭩{ijb'da& icU:[`nxmm/Ig򯗛c5)|YG~N5\M~q"1OBE 沮DǶ%?Qo"VMc_{(xK\6D!ȼLHY2Ϟ| )B/./.7bb`=0-Ӫ:V~=vzrq4_uÇ/ނъ}O0M 46ZtY6P7j~ڏbɫr}/"6ȼ 1/U%U3 8|Ȥ"60*:yΒ'^;>3''* T0yD!p=>\ x?c*^ʴXit9">g^v0~5ξ<#з>N "4?qOe._hjЁEPx_~c /9~7Km/<͕͘>Yd92ctA߇ tClK=`xGZ5ޙޱyZ2ϟy9`r~vF)R~B𳊌b񜿱Ó7:Y aKTǡxDrX.`sIp̜Io}qi:$(4c_5X%>u-rv=ށI$PpG↓DZC)A0%4,٤C)4nxStAZP>a@8|mm @Lm՚PnwVB ]a;jlNQ~+0yQܐ܎;#PLQg)ZS*/_ <'(7{yr}^%S̷@A. I> z9 4@6Զu8?k 18pwcq2wFb@֌;\Ȯݻ~Rk%8p(+xꕞǀjEɬc;?B' mϫ ZT-KxunU(6NdrE1j:+4GS#ܖ{. ؑvpP?{R}Tɯ]vOt[2? .KfTn~kgoz wi6+R8eydE>N1Ĥ\#ɅДr! bCa(J(JR.J]:Ff ̧nSX4Ad4pBh(]8uUw<ɴB7 OALJarV[" M!m@|+-^\-z[amïyvL1:%мNb:/1x-|n_b{4s1cLo B7ipߺq#Ti2힓nLWKV*ʋoᘏ=|z㉿ bov|6Ϧ>VcϦ .va##6)pPB 'GH oH**R(axD%9S:S=TLyΔFC3 E>&`OhK0dwHAnjU\s\xwLYa0iaTɅ\x7Xߣ+\ұ_$% B u !˰-Z~ZB 6AVF2Oi֓Ş'?=ioGЗؚLdNI TasL JU7%Vl3pdSw;; u+֞B󥆊bb~1>87Ms8Fc9_37B4gP;&lj^07R$5Bƥ)* ҺYfNTV lIl֌6-,urҲ JrL(uv']PFF?sc O({±]qqֶV,0΋go;xEz;H5bFs0&Zp"'~[ur^Kk&dOTűaQ,z7^_}w& OAaޛhVԞ{Mtau||kDW@e"EU]EJJW4F[\y,HcA X vhZcw/[X|yl׃! ;8^Nz6 rj{}ë858׵^OU\&xChS1P,;A*pd >Ӽ܄bn#*3ɜ~88c`0EVv_>ǚ"qZj'ȭ3Bc}llg]{*f"+57,=RL1K)(Z~Ug G鏿f8}gϾl5@3̸6.8 \ƈ@yxk ~'{OL`).8eO@krgO x} a3/1Pm)ۏNG;4T-Etͪ4jʼTGv83XLd@wϯS' lbEjcQ< Oڋ2~B' .>0ns<]6sm.tE2֮~Tl=@"u40]xwbEn)aޏK7sP61_,^/xOWfF5r+#oeY n'i\\h3׾e>]x"H2ewY)6JWz=\%F|;m_FW}@u:EԌos@/ouNDQcBYv'  ^g"<DMzOfdDvRtpz_\ +>v:p8#nbݟ! 
27Dd$SxoRM7J5T͓Q&0,PđTqyu1 7BVOtLqTd`ASS( D`}<"$X SXD מ#A)gZErwnD4uņLO%lPK+IPRX4BBOϭ 8rs |RծCC{ )6H@ nNr )H"u`C) b=hSu`QhtKʄ2+r;R [Ś)6iݜ4QT~7~Nk }jW1ݛS"D`IoxDXnsUT%MM%7_"QHAkRXY %=&`W *I-@XZְSSh2?>PJ=ҥ`HY @+҈ZP`.ecz/~(F`&<%^*yXxN,RBt?c;E-JB\uD@d/hK:qEyy<'qܷiKg 6`G|6WSh9{.g9D?~_ijC(4C^0BِoD@f 뚳!B+yf#,1=U}R;x|ǥ<]liת;4E"-1 &P#G`2o~%cscfigIOP}Mi;J< 7ɬkL:aCDkjъa:)Tш~ z0]g_U9/4` '͆$r*(F2p7EЁ#hNJܯ5+ەIa`i ^B4vY!^qPΩ5~Mi!r) ]8F3(0L=D X$N[A%pw+۹+ڈ= w3ٛsC !7SZQ0 GT{+s zr.RGFdCu41'+ N9襤(~ V3( x p`ҷbZ{l&qX2-ù6"tIB(JUθW2Fj`,F(*k-xv.Olg88h%KwA ?fH|Ԉ\F7;":%O+J&k{Lʓ(Q2Ы%:]LS5I;$ >eL DӋK'5q0 c7 B|`Ci0ie1釕w=y0ҡrh OZ>丵[gv`h_1n$XϞ 0H3tNs)v&q-z#¼v㌥ >jEk'w~ifb{h"xXl9/cz,8tݨᣲyy?>cL˃gyB,Yc$ҫ;c0B^&%LKVo;g ˳JVĝ;.1[?=iK>czUP/W=@ٰ7w1eB b"0A h^:,we\c FAu[ozla[%b[5D%plbuYAr9W@!׺JQǸuUo A!V &zBWPNbr-U9dc .L.w4PBRLj[_`tų,Ʋ 1ewYЄ3NDx$"z vMw*İ]ӣr}M߂n/.h ;ff81dQ全0)sqCDXN2@ Q*7ۗnJ5OfD%gY25] P #ۣ}kǒSv5D>7M!N 51l"(D8ᑉ%g"<:=37I#!}c;$f|uT=Ju9^gJHyn|`_A %${oVwKj]vZe0R_feeUs!N5VU4 lQ"bCy8rhQnby yr?:=^ܰ3@3 Yʨύ'ψe+k"z}/ϡC4 7>-пشOAryܽ OGd3cY9:̮+ǪտsS_ ΋ކ{OUJ? /N_cxߧ nU(wEGOŸםaGߟw:ٽ^ҟ fKןvM*zqn}Z$FUv(DoN__Ruw Eӝܽ:L^w=un´q%SYO/vBY~r39b+NՉ|P;T|c}.F^L@թ]z}sۛ0_ Yxv/7Зn}x|Gz:0 ?5c+AG'gߋ?<zrعU1;/Q٫ޓ[@9e,o_RpcPk,](뢍'qNșn/tϰ=(\ۿ@)iN>xY.b0x2>}=T"\|~@͛>jO$PQc%;~i:x8z+t%el̗߿Gj%v]bFO)Rc7犞UgCUƈdY՗j~ A׸ q f:PmDST/N:[Hl>,;w=aY"m'1Ju6ޗNaUq^nh6:nqq@N~vq8ntX?|7k9<˳@vUdHko&w!s4cHqcŚ`|dĥk0g,ɔF8Ќ<m! y!6]6lc/ޗ5dBLk3ƴ̬!6 96̂Ī^+hA.Y 3ScLǻqbd~<ڏG1~Ûh?ǣn?r"_c? 
eB1CU1D`#Db?3Sc&X.`<]m&$L==ZG+ȯy"Vъ\ﲿGYy]cH"$Fd<yr3 WP +*u3%S o('ly%hKmɣ-(ݿ:[&&%0Ή(4[Ց?ڒJS%eP!.`zg5PMӾ3w6vN!.;%e^Nj3t5[݀B]+i.DUnyɨn/v o/ hI]~fT匩?OF$'x b'sE%a)g9jǚctVVIeŒAU&LvT+T+R-vQU i;aBeSAfJQqQpR:L@a4Nk@))RA`k '*\(@<+BiC- `-wqtw `LrDw ̓Rp !S<102io)^P@`D#P) ^ k0j'G\ӭJKaqB}3 L dZau0*HyA^'y@DJ4&L H\*!`p݌ʕF]-HRvw h#4M 2D(AƧJ<'V" `_U ^2k[Xx% D N GR9,:`f"KcfD0 ]+lrɺҢЦNVslጣ8WF):NcNǔ4nnOUjΧzKrB2 {.`%YmUcJRXGĽi%>gMgĽtE$PN($$V$ XȨ[B._|tRZ2A rI[+]BqWɕ >HrӾvͧ`z ׃p7O wfe{{^t]梥/z %2,rNเ3k úiKuU@eQ`JdbIx0ِy4wt"̜jgB+h7fPDt(a` A9*a½Hb<׹ͽ%mP Z*Rؠ1$'HB\+F&B V,6%xǤb -GnPNnW5`om^􅍱GW=}KWMx/&M=grX`/GTeҧnsL'XU'?'o<3+zYg >H0go\=ťA1I׌Ɠ#uS0)Јη1:E ̱xtү~u݋^ WRKs.e 0eӋ&'ȮjfQI8YD!XK֫Ur΀՚*)ֈD5;3DAOf  2UbzS  ,!YE.:d-4;5vU)\fִ.2Hɬ>D L* ֩Ҟgx=JҺGYlqVJmkL;ާ`O$b+}8z8U\tBsWdnɳgw=L! 3MgkH3U{z9ODDθພ:a4~ɢaRFN\&g\11FMf1Uʅ)X^fsc7jQyU*'C,/߾3u>bz}S6!eQ$B b9t' PHW0V -qh;Ԝ6c"29꼣݌u>n(|{'K"'QkftU߾|mOZ--x;I}hх(x\+'Oct\{ >{ZDrfeF Z3Ǚ3gn8nEbl_IvIiaG&O EosWƀH3GžAq(B(Y09&fAadڈ;G(1Kq(@ZNiP`'=s\dL[~\ъ!+Xo &]|8(H ronY" J=vԀӃӻ@(fW:J,F{ܲ_{צ L}v)iCjxbUM$qEIe)(Yr\< #ScʅD:/3*Po8cBxdXEi@: qh4G `@ Ks$8 = r= AIA.]p]@ fI=Pw$Y BwV:NwA=ƩDwA69f{JPgZ]Đ;*@hV"4|qk6 5L>%zRyF#KZM􎰘+)Vwi拽60sܧ{ܿ8Dxe-tyNBwk3UYJїm)&?ܦxP-M~bM`LQl#ua8Dm'TP=i\`÷/Ro!!j>{tG8'Z!FmN(4/"1-?F Kֽk#Z5ubG4a૭%a]rYOhƬ eLLW-3s0Ɯx$ A UA+'CZ-kF5kWd_}0ؓkFsa3!CgHTn#FIAbpRaJjarhejP ms:*Ũ=h ( Ĩ5-(r(.msIs$5y#m&BzG$pdn,<,N`)lj3̺̆Y|ZȉJUj;~GW8և)FKF):ϩ`eLjXrCi ~|kFvML ]H sI933T8D RHR)ձhˉZ1)=,OsFlA "tv!NڃWF0U5(BS QD Q)9Tz` cLk%t[itOj-J.7aIvSyļ RJR\F!*r<BCrsS% P "{AɃt"4Ri0VuL Vqn V0jD{W54PL*qy`$Yhx҃e "P XPi@WE0L& ]o8WrlA]v.AvHc{۹_Qݲ)zvݲȪUHV1 ô)cq(G&/JRo{;O./9[b jKE:?E[!mM.T+#G)ˑ!"$_›?qEho(chU1,TGC ~GC|0h ?b$a*_ !!-gI"$w\!/h1m(i8T{ .Cӽ[hFTJcg#􍫕K(0v *$͍hxi ?[<h@Nd {Wb4┨ы߻ՒڅoBq!Ɉ$(2w>uj it{%b÷40bMyI{%RGuL?>^O = 5TkS*  !%9`91jy 8 Q<\'(#w4Dv6j)8 )Ζ5e 9=|\P0VR$ *ؙXLTJ 0/8UE aEAHbT~4Y#ZXL$<^KFeHω9*c1<С[}${^Q).Op9.![y@d?+%5||EIs\Tu^{Iy*%MpJPڶ:_W`hw=N7')<6:}6ՔPPt7@H; #b 1D[ +=DE EH A!B@\`vq;V(uXKƼY=%5,ъ}eԾ_<9VXI49?c?(~豂![y'q*Uq74E̟s%U˭V [ n" jŔb$%m;=fR 
Zn#d3;pVBy13okEd~D̜D?;(O?NZjr+Y/'?o-al}z.ۦfR?]ܮ^G8cV`_ma Wάf~ȹqW! WW3l1vx`՝C^iʊDrsk:f򊉑=G jƁ[P rԓGexJB0p蠎IWKx: r)%ô6Cw&M1-хT]ݳ~s~sdۗnKRO1U˹oSSfR7V{FW7ۏqwYcNa&^+ cL弿6@KLl2_?]Fsh ;-e6_o:ȅ6n q;BoNH i.6i =J훤F<8f0l&A;GOH6BCAsMwZ\ lw2x|z¶Fa(scs>Vp%le clwԜcwX)eT|$2ErNQ b$_ C^~uLUs]ۋkU%Lf}ش\(9#$zeIJPW+2^]~ D3l [IM3l^ZLMZmi5&@ɥqp3TB \'O.H\ )Jz \ jӂT\zUH.'؅輻AXXL// p$c|[1Sf^ZK;aRM4hȦcd:%,{cz[{%*)6.LѬ]-l{ ײpCnp~ȫQ֫- S:{wqx|) *oυlN=7'0Fb~|؀CG}؀߫Aطyq>yƘ2˙ |CNR頩hUU[ 2>z.JMc™Sc%.8Ctz fyd7wxy/HoQLrN1N~(<ۛ~X{}c.10_c^>䭥_[pA ݾu|6+MZZ># r9 0liؒQxׇ%LLʹ# i%h}E_8%N *tOy ~o\#4?,(1!!?*.lÇbv}'%g9v 5Vnf[p}ҋZb/N,;F֢B] JX /Z (9_2uBJJ"(KVS6tqRmZ)ߣmAФv>;~Nv m=8O:B2h (+q@<-|f-8zNsv!XKEsolϽ= b~cyNsUD%pID$2TDZ2R+ǫ;_Lx,vWMO^ HTC\}t 6zfrͧ|%錰zױ]usDח:H̥JJEu̖Ѷ SG)s.D^"Jd2R3؁;xgKRH?_A %—s}srOzq_6u|ٗ|qv9/6CyȜl3I%">*Ƀ! 93|،=;x╦B~:/QGv0]d~W~O^!2u~yy߰W66],`23f~m\ vnI 9bUtAdHk[Xk_ջ|L䢐B;l*o%Vc%c0H#"1ɚU )Ҿ;IkۯWW d9#d>ĘfZ1%0RdCapLjha1LB[ GHω€ -cQL܅Dannv&"iGH3*X_?˿묃;7_~tkt0 U;huw_7ֽF1,`a=jt}˄ulkGcZZAӶM9O'oU[2(?{ƭ M(~QR'eW'/I0cԒm )ixgi:eiFwt+ХZĄJD&kjRXp¡7M(kx陳RbZPCF- zko ylw~<~,Y!g4,R&M G%S"PP RuJiE,%!%`qh0#f1gdi2χ0#iM+`3B8ގD[(ŸC6(&t<ܴYgjm2Jq<2”KaeXPJIVg&LQdz"OtB0r{0A|L]y- PDMW>,5wXq3> M4 wZgm"aDJ(@ E0jTbҺ)K"}g_|}-@䝇IsZWoMn'+x%͇`I!o^?>^MGr#|=?/4/V*CUBW3fiCt_&xp-VO. Qb7dߖnm*ʼn&DѠac,L72``Ns$ z0;o3}UӞIb%B7=}<+FX׿\ ˟b_| QkSn׏"3xYH,%`g:3{=<}4|'ނ-|%<3cْK{O&}>}OCY-獣CiEnۭ{/!ĕN pe뮴w~H0AO/ʰT%}Mp%J3wƈRJ}&Uﵮ6ЁgLim7梐 X}zG+UݓpY>|(ԚZ}Le7OO#>%yxY 4 XtO7,'{:}(aHhLFnuByPFtBݺ {LoMZVŐo\DdmễQEuBwi`Bjh}Ҳxrw<j|n8(yeyUw1 b* oA4ʃcc*:=yگ6Jv|p{Ϋѵ"o]W"DU:jw#A]e>W,z'n$bލ>N,>'\ S橦ݗ;)r% |ۭ{Kx<,n@ɏ\.ΧcΕMLF%I*Wј_LHE_jc)OhD݅(:Zd>T}>uqƄZ+\Q$ h@SLSLTRõ!틽0i(sZjQwu~x ֋5. 
x*\:~0dEZ0d{u$Bܸ &ט,ubQ5yrT|l#vbP(is#w^@O%GW}pXUK09r-XԻ,GQ^1i"f.ޭKyZUʴb+^ k̈>F[KS9p+ў*n\O.2id0D/ȆY&GghUv( a^pdhLIwd$^V5&-\-45 NFH9شB̽C ʃ23*C&Ň||=uFX ;`)=hEB@Q@imHTa_)$rR8+^L X`ޤ鈖m431ƺK+CBܮmLӯ[8_K9,3w nK~'Pi> [(e+"$ QԅIa^> ~:RDrgPr{h)l+Wp>E%S_,-Oye+Q(`1-2m_s~ KB@ Y1r6s٭KWX*ԟW||pkoy'|*?{_ާ?=zQ煋AC֪!??^m+".[`u/`h+}~J`׮ PKC~hnFv~t4ATiҌbD!"tnBHEptpf"¢ua4=EkiY2_"E9BT5Nln]x?׷܆(@1LɡL/by.f5Xq:1#H#JUbwۭkWHT=jddif52ܑ*hO:`vv`A1eRn7h#Mk'.5v;މGuOQ \?.Zv)fBݕOPJ[ԦB^kr60Bw-Qn%W.ut{8r$5cBH;Mi~*63,-"tZL*ٔ<çMsZbGK XtJQ) `Q7R^FTiY65"OQ[|ߝQ|BuE Vw & toN X(C(2,kd8f#s J%8XD-(MxZξeIRXNWl|uSHWÐOUӻ['/ on  ১술n|6fG7x&;z=+Her>+{*Ul< %TjA9SBTcʘt*Qn0* R{j\IkE] VLb.UōL8op=]jٚʸ;j _Ɲ pdHg3b8mSΨID*# 5Ɨfь!:TV*T=UU1`˥y2-#^$^qJpZq(dR`2I$10_Usd F'Bw 8ݚQj35Od0\U6hW&HBM­3(5 3.x&8MV)k,h_7I!S$\}ܔ_Tbz0fO۹ :E̛>_&?_˕O3qA>$kaf'|ҾǶ/i>^$Oߐ7 /,x,~\{V#DFl|VZwS<ʿ]_ܚUdw͑|{,pN/m>qM8TB%IaVv ]hFaH#j[JVUٸMtS*'`SԧƷnG1TLxL4XYuUB>4?x\Ìqf:(wb|$&MԂi& 8RJ)IŭH&EFk~zҏ#7d1}D>_ѥ+7\KH,U0e&6uTa۪$}nA:X e,`mfDINI"() /x7j `83B;2}Kp HL3 | gN{wgm1x5Rp]"Nwl%P]T z3I#I2nM>% (z$A vI=0 >~b%.}[A`; A5ZWUnzhT{2q*e,Q%YV'YMH "D;9DV;ȣdo1˜lAu1ü(bJx<-|`W΄YŞ~G{U ^Oo"(' 3Ը4e ]R K:LG0)X Y]9`w?~ L/ o qps ? 
[binary data omitted: gzip-compressed kubelet.log from the zuul-output tar archive; not recoverable as text]
Z-lH Fr@PCqE#%>L)D"QjED깶:I'j)T)D_Lo;zmpZ A7D}Z{AsHrj90yǹJ%?ק/^Rm93N -OMXuk>֨]ȑW %EK@LN&oKDr0 y*@{mIA j[VUk\(M3RGcB5o kA & *6y~#U\j*5YLWp%E mC&Z\3j.fGa(h pǥSv+T޵?m+r6dn9Iv:M&DTQvz_ԃD buZW r} `,v絇Zj9#Dj/ݵI8~.$b[?\ n{ t4v;)h8t"/GCض (QNafjņ.=oCÎƮ(y"]gy+嶜2j#?ל灍RB+7LR8ϳe8i3S m+Cfdw ;887T* Pt pPjw+(^{+CplugJp@gR֎ZL ` JQs{Z}+˒zʼ+>_ouU_O'ҹ]4 yfu7U߼n!e 2*_(bObƄƩ"2\@ C)|AWyIp+vd!2alٲܳEf7DZp1B\@b>9̫*_!B{z+Mh \De970}*NHeA(?Hl);>ZxJ>1e瓺ĔNL=12.[^⃛v_t㜗|}x@/Q0*ҪHx?*ӿ vFc18(u9}{~ ;ڮtuEQ?t4.E]_.!ơ=$C$,h`7$d,~ =ӅAo,M1ݧG^4YNW5hug'??G>&cG~~{̏G_w6Z}"`n<}?=i~ͧ_No~9cٵzoSdm6_}+QA;m6.JK*Bж$symF$C~ϔɋ֭tbiOc1onOo/9Oo8fߏtQGɧk}gm3}L-?뺝;~|ɼ'1b H]>?ǵϹLz>F{u0Z`֣jnqa~mo/ \+~u  @yI~}&0S{h|qߨ?ClIO+U)}v%?|ht;[uwO&O9wYè I}l0/ЗGoMcuvIU/0l/̦J{W=ͺXӝt8j{`y;L[QxQ 5=_WoƲ댴 r "mܢGI:}m{vωu3nӋ{3K>yJ"FK27vB?*w ^p2}gVeX6Wnlvэ3]޴ ^x:sj0t7*}th">kUh?]tݘ? Qˁ$}4zkv[w'$Lt H%i:}EGә z3K j|GFi1tMurI?G7#MC+tJ>_jm|}{+ͧxLƍ8<>y63WRvB~< Ÿ\\-`:d*7^iW2,P8iE tECC.mmX2aV|=3 q.$iePO/JkF8÷sǩiRVLs@PXC(C/)pBG 9D>ڔ⦊ͯRK;vsTVq%1m- Grpz sJ#І':]b3Paz|[Vz|뵁J (jvzMJmKgKն kњO)sCjK9 ut$X2^q3 OGé| MJQ "6]\^J~hms uwZDra&w2fb`VwR[ڬL:âaFl!n-j[% fumfG9/I^ټUc"!.}JaFU;PTkNvNQrԳq\F^kDi͇:<+E-q(`=7cu@"EU *^pi˭o^Q Zx^m 1Nŷ٩H%;: fpi(R.c⾏%ǔ>ua%0$pG_I +[niUω{I@kuT T$IЏ_^ޚWp/^P&8Va1LD AE:~pkQs]LTUE&4v#;DypbBJ0~>ƒU=RPs ]yq<..gof">15Lbae/@_1|6br hΘ1_|hK9FJ0_N,6 [RY.dӹuV;$<}Ԍw=?l٭{膺P?<7C0x!* ur"󅂮bJ$uC  ߋFsݜڒd„Z(1n͂*Fµ+R9:ܿ[AVM,OWGRyV;q봚 *|r< `@W'xSXIfOw0@ cT<5ϰInLMNUq>qʑ+@B)*0d F0<#0 }pW~ |C ݀!%ǧ@ `(pѰ?$so:/M $N9lԘ֘_17G}߼3YWm e8aGjۋۋÃd9F&2zuqs9A0-Ż߷''geyz~^Br,g}z@c]-k;cC=i3%s =^_@9'x5a?ݩYsCcή(=FQs,mGE2t{/ ;o,Z(IynFIXBLjy˭\ HQy*)?|Ls%\̨یJfr[J1NtE("&K&(k,cDxu~E#N0,nUx3-٬@zw'j?=x"{_ϙ{xr%qUpP&k3'v|'z(' 2QO DLu ]vx'gꟜO8b3gAx>|OSEM(l6iJ$(gƺ;q1u VdGjrTy (Z|>v6v'G՝bŠwI$Wԝ'!Yg%޲Zx`` IPP,]IN7 <"SB ½1/E^*DUMd?؀trxY3Z/eq:qn<)o`ǽ)j%WіU6gk "(nAփ*Ar7J[<@ų2b!~h",'7J1)_4 r-RC. (g 9$J g HC r!;R\)#7Qqyl*C >[D>BFEC]3. 
6ǜU:uliˣ&l7rTA~쟅~4p}7 xy~tJKQ}$ì#aЁH| Ō*ĤY_jŔ&ܞ;$%;K~uS}߭XX=G]+2V-/4!ђ%;zT#%Fcc11.2Buɽ5 vjrKܛNk0[b} v90HlcK4hmaߛ;oEbb: lw&at)nR^FE&=w5,0";G/ٲ_LNqYgB4C.SLU"+ 69|gZ6_a\CCڪ6[V~YyHx]HH"y-a4NFDJdA6Їq5/F1I @vTkǕF\;/A#GI툵|9jƦɸq~]\pْ2>:gu .4e3R [.ü!̺?WUbؓJ0d Lr V 0_򁭡|L)11)MʋLFÙ|O[{0[AaFEDa Yjz-24OO6}dOd _YWVxɈ2XO2,5˰B\R~wdk`,dGe%S]&\cBCTOnD`8:\מlݤOiϳr{Imk4S3'r㘭Yϐs&;448mנQԃG {>%7K@ ]W!_@!ZwϏ{E9q;4ޕAEǁϷ,zG-4@[k ꯼ +]GcK= U+ݓUHp1xƇ"2x&Ѱ7L+tv^aZTKg5vgp$l`Pj!Hhb]"Ç; L<ʁ"ihزoeP#on)'XUF)=V;3W=R$oy0V-w5}f)qPα+Sq. &W- 3S,ϣͮ_OTwUro.PtvkݦY 9a1O݂&Ĥt lȚIk5i5Β6UAa弇OS:s:$C{rm3$?-V#^="OibJ+~n 6귳C` sƒ4y'amH":۴{:g~~;r={`Ov0W&xU3]aB17IsmogQϐv>#ń|cĸ8>p4c,K ~(27SOqn*2FWA3Ü8-0Z(/xzQql zˑgrzH͡sap1p|/SMU=cOtВ*5wh $1B{(јk{'Ǵ`Sx=㲁 RL!DH)$ lqC 0PfatA6%̔ZfRt|LݗsIҌ8,{XN6P.Go?>Bdvë#ra,br{( !"2Dك"JY'yTy78q|raݯ{gÄR+Z.Zay}DA/O@; q (Mz޷7!Td0&RB8"QsoV:CR$BSjLKޖ:+$ZbpwEYoojm47h^g8 Om83k GJOp8itdxrKIoyA'e]ތ,\gşwR!x+\UtP͔ժ˓糹t[7\_\q]n`"OS~0G=?0kw՛/%CgSIrP^"r껲wrh~7mgGSqdTSIoI,#F!c:L p%"U6Fq/PmͥoVCE"4cfA.1„-[Kw>"r*Q!0g=:7r%} eMܧeyxfLYSI\.Yi/'Vŵ]fq&76%SUOoiDnX~}Oos+׃43_?ɶ@:u [&H3W}ӌj 6A`VD+wĘ'"aBjpe)fj R^V0g}Wi6N uNahpsWPYY4γfY{ >`7>>FO_E%,/lڪ]|;u.nSO7j<-+&aҒ0w?Cm؏}E_( >>wϣ*>-\rwϽsP^B9IofP^}.5 hA>|pRYQd \!>{gʯ߅"_OO-~i~_r3_/~-B?=Yz8U0_ޥ(?9/ f,Xs }߇Y?0,~W&Z (|e=,~} _A̵1O{9}nn ~毿-ӷŚUq4Fyo{Ym /0e:'3WWf9uaXutñQƻe <0zgLhQyk9<8/ P8hIα4Ta{'' sJ!:cTx2(c2Pn1؊K1mo% ɫa aPPs,WTQn" ^ht!@T{B̀IH0 `DuuozހAPTk[YDDsGO.Қ0 /L1O:J-L yڠ'ah^r6A}!I! C~d= Lq*QF)j@0 LC"~)q`J>Y%82g`rI$S\ b ^Srq 3S8#-2H̢B8fpZ 4M rz?!*?/OώӍi kȫ$yt|ea0pXUhcxFc$BR%XCaP5<98ڨ[ax(h-29YXjKЍ ` -Ӡ κ zк.LrQ]ͣ.I|?Jk½h .\'ށXS''qQ1Mgǫ35t*0צb^( y70Z롈ҧheֳ|S;L,s IJ{QZ2wt)Ofچ3ñREBYCanj2]A}U s] eW.͏Ǖ7:k2Q@t dAuڙ{/>* ڐ/}7yI y:M35enE:{ݔ^Lw $|0 ,>~ax>^%Pys\ M% _/nVqp(21{7û258ym洞FlVK\(:\hflWy`*Ly5R ֕}t5 ]rѾuĒyۏ](#Bݺ՝ cA;Г\=&fɎgNf5ExMsrCA$Fxza\qO(wST7|%J.P=tOSp؎ubS`(j L~~VG 8䠣= (W 2`lB;Z%Vԣ(! 
4l6BO`U:(אcf4l6U&PȷKqJ6\n v66R+qȇ+PV;W\Ԇׁ@WS8wvԐkƉ6AnI4 fM'uEP$HÄDB_*(TǍX/ ^?sppI^r`cg& -R%F0`Gf+^Ȫ& [9Ix!v,&fB.(o2, EJDCnh>Dd|>ӻn[rJOq[7==s$*ugFq؊0V.f/Ir !X46<=%~Y, l#7ɚOmqFFT؊$GuLayI(1؊@(*,mZl6Xd Y):k}oeѐ Gd34^VsT}:\@$$Up<^N.F#݋r/KlIRMM FĐKYWWMJN#.B5dE񱵕?6#מncV5bn/$ $|]kCb:*`Q` 2f+{zb=)z<)z[giJTOJ<?; IK׶`,JW݀]p8~vR=W/Ղ w{pJ׃2Ȕ-Q7']H K5/!F:M@icΉtVKQ|HK[|$=\GQzcr&JN r˧W~1bkQEU' ||p4|HIˎv!`1id{![>3ʍ-% vJT4&wBRTHkZGl~r @[1m}9m!+d)@m=D\8{ﲻ;x>/ zjtaI@:#we -3  Cf(͵9AZs0u|J#{C~{Xf -I%{ì.P JTI}XiKK}&%jEV,I Fͮt| |sʿ,ThFjڸTXZ"mADTpNq_c A >w]S3׫;܈jWI~ƴt몿e|( ]S]m}@SU"=3M)`^*rVv^ 8H4N׷ S#%)A)T +(R->S3jٍN^q)cɋ٘Efl/3囥xg$a%R% ׊7BL4,ҷ$k5 D̂%%1cL?Vj\i3܆b8e֓2*~lg~Qg:堹7uH?ek,C;\/yT̜4c癤g""PI\/+9ϝ^D Je+h奵.X`+[Nm% ZppW7 2pk )"Fqb(p00 -G.'hFFV9FITc{wr0A9ReTvՎ'g=h2#T/y>GQ hr+Z2TSTb;'g HuG31.cV;CZZC. %a}HէC2pH݈6Bcыmd#h=-NY9BW`tj ' ;"uf(MDYQmDBBʒ`*Ęf:Q;,&*L;A,lXa?68]t#.P- ɝ*2eh lQ$4V4G$LʹN˅i˙ N  [`j1%ʁc3e:JI_ᯩM&B{@5A$ RK@zX1^zB &Per,F+.qR-FtH,5g <3[p-4ޤD?F$J812%͒} |Mbqe )E-5m\@̋ 2ZUT7.NT4M ̊I8!_ V'؏:)Y#)\2~P|]A 8 a_jCp -+8%_#vQ9ge˦qͰxiA %c-\Ah_'j$lƯ鈛H{|Cbc[dL7>k)! cP7l![AV L\Ec'Eb,Fc<կ?_j* P)\^JϥZ  qdC[yX`e~,i"EB%rڨx)8ך⬅F俖XۥAvBt^tw D$X4-2פb6 ]$L zj%@<ޡux&IKHZϓN˅値CLjZF n- .y@L5Y<YXSpu20:`9lQ64%үFC{G|.;O(1&\I` 2}X| 01K1H_]ostg[|O)EuI/K$=^)) d`T|kuM\ACaZ wm}yAZ-71뷎Vy@\(S )WW[`hW ZUW8z;;?y7ş c6>>߻%%ro5|vL?,h<*~Ѷ rů|1y*ւT]w~QRh!jL8!Kl&[>R( Qé%btT~d UQm~uYG}W;_u1B^=}L g][ݵKc;Z[w'$rʛ)o~fϡ|Ծ4? B6Ji 9gA%H*T C~u<y~<Ξ?e~MgMCl{O8ڒfγγ[aɌ3<|5[|c섐!#ɘC׌'"WS- `2!9kr[a2 }\Ix]E7gFQ<2dZWd@0 Pфw0cݜnb@MQ5'2QCנ3,L`}dD@ 0o6})c'7~C*Y]z8䘖N2(y´ӊv'iqƑ,L v!6qJ8O*"Ô{ baaϸ4iMވ N_.!X#gH؛5R5rm)6{T}2E#旱owŧgY[W0>L]°2)~7xnTRD ϢJ )x;C|RB0A*SEFo04aラ +VYEh7#jl{"))3@׿?x5BaMHQC~FNIk>DZ;>a#;}&Ǒ-^SȖbtGDao`}kr_UOC\ѭ1iH<:0 %wqM7f& 9}+*:ԒlL'QăE":1&x "[*<ꭋy\5ht5Xx,Q9|:>/0ߜRд]Ε﷧0jxOñ p>~ 32g컨܉\(&z|{0Zg|Z",--Z@N=[0C$K3B-G{yzQKͧµXf=hMgn@l][oF+_tZ=<df Ny9 oQ[,9d}%ˡ-"ER"e bUwUuWW}̮VBk{^Hq ۍOcݕS{&Tߡ|,|9_,ӆSA'63VoE0a*Tv_h4O m0T?QIb=ɋ]j+A]ӫƻy[n.Ww:50W57"Ɍ(i3aʌ>#9MyY7Y;ֵIVJ[f!Ys&KՇ#gu$:^ ϋ{۵HD䀋ǝ,걳~I0Ctѫۮ%TKFJk9N:h\LPk2ɨ4┲Uz1ƨ/O*rxl0iq z+QJT?! 
HM=Cc(PC5 CB˜lLu)EWМ!{d dg5=ZZ<ϗѝ3ұV:}άj[tJqs<\l>7pk1X1idGW' ƺr^ ;b%hYXh:AΝ}Ior#vۗfgOz; ê"ba\|o˯GNeAB#nB2=T|NDO;P0%N^9L_lWJy.)ݱO@;O3$\1tN]QK:%39h 5c1zE ڒNaSs*MgW+W6ߴ܄MR,/\EfCacB`.W4͝nRgE>3ĚyvcZRC)Bs} (vϩGM/5 {*ą%!R%d:(!HS7m;mQ< =35mdCcj-שA#GPc ,B%a`fq9(Gv8lşZTlaI8??7pi0S12L(eqiH^}4C%}l%š&3ׄ`n8C|MzYB`%U:gڜ0pR~c{A&悦3eg\nhDW_R.)KE}#5xA).-4̿jѩ_)4u>ۥ_UrZ(L&/s7s!q&,?7gtF9KgtF9+Q>Vؿ!ysk>O잵Yz8D`4hZ<[~A?ϗ{1\Qy ԓS *4Tfo/ۻ7r ~ q0~.xsU>D%tjS5*!G-V3*HZ}uC}D>,)|u,|2Rt`!JDioT jέRxrQLǜZ§G_F0@ UЄ ΢}Vp@EμmKX"Oq"r!dVYm>Y!@L yȕv$DIl .,٫`W~)O=^? S\c~3)2b Z3I 8; ZeD=IZq 4a2ZaZ:'( -Hߒ6pj!U硍y_)N-ɴk ^ RlFkꤙ=nuDNz=V4I:jҊc7ZDNk^tٟu ۥ=Al=1AbuУ5ѣf18EÚ[!Eh>%BR(%"]偈 i@0f=Y*Qw-mڗyY_o5߿<-_%5:3gm>?/|azWwqw+-_ CZ{&}u+bķ=OisFț՟\+v߹y$`l+&5 R]3eJg`JzBC78=T a=Ri}޸cζBjA|n >#JJx_/ݵLZI˔_{w9-ee ͙7utF,(2ΌŰZ﹔Qs GuF}tᬔ 㥔t@?eEP@Cs弄1z0EʢK~=~v4bWjΟ %#c֣ *{RFiUZ8""A.;p ݾbV%Wt")hJCMjCDU{'aD1/"U ʽWA`D ٫u *`sԐA6Y4H/ -崍7AaWuT5FQ]]wц:z?8W_"i3*Y+*+,baXS'\|D[F%UFi^nq\|DpI>˴Wޙ 1p;sEfc*RZȃTz q6h%R`Ogߞ2|ŪT6lD:v@aMXTܥw/sEٙb۫LaooD?e19yop0"a'TL=)b5%KCSe!\N}\yP460\h\92ҖKΔA)VX\(],CN92bXdF*졶yYi*U[(-R‰IDʄQۥEdz[C K%eByPh5Rv S j1:l 7h]dDI* r rA\BmbAE,7"=NHB TsM8]P/q)Q\p$Lµ JFX/h9"& Lǩ $bHLLDLxUSCj zZMT,$6{:l~~O?*-Ukv^0WqEkWIB Ejxe@єRK0CsepB0 h%!q-[JDI \u9e.ԒQ e\a" j) VOr*eRLeZcMHr[X2KDFC>Q)4ѻUGT noC`@;NPuin쇧(Ht >[Blg)Y|g)Y|VL/PZKtt$4w@MpМѨ!F X^%DہZh;޳5?Vr |fi㊐&. #6旈qĵ.~Bi^=ꡟ:|UZt]BUo1hX,p@@o~+r?(Y`^8=e) zR߰))r-$)aqW7lw%40nRB$[tYwQy.\byuQwۆc Dk=ZeN+ 63@h&BJdr3KlYX㱥W\j}PJ! [ F$p\4KE-q^.?6]1Ec[oY: .f=8/h{Q /iâ#HE];KJ+p^)c$T:LsShвJtW=!}XsrƓYbxұa]/CڝgZE30J{ǧni*PPc  n~Y]v:Oh|!0z= ӯը55* D.Mr%zKcCy՟rr۽v/)P#́}|-c|X8:w0z%2t ,t@c͘]!ӮZN 7QL+!_m"W'y,y]-2 F#m(.ϓ!>"SYedy!]>@Eg4[f~0U#]K_r;30xO{(fEiIΘ0 Mיd`2kscxSC 6g1rO˟>QFL^m ;{4T n>&{O+M5aMoC$t2ܨ>R2+vyM6F =;w(݀6b0I8j/}I>5T%#ꄣF㨎3Hi;c|_*L NԵwN*`:9ܧBp .ɅCSsŋMmRRB^}"B]}QTBў#vw$YjN$YjNi蔶< J 9ׄpI2 T(FĭwҌ>RsiP!\ڜTZ%^)2n6iV0Ml%4#-;H[r-o"D2(oVBj1i84l1LiJp]jp~(jI_{UӃ65{p =e\g,r[-b2Z(HP @Z%ifaZk8 XgYہZ,k; uUsޫWc0KG7 0K 0+&՘pt՜<Ee^\ǜϵS{y)}bdm5n;P+5n; g)zY1.Gȸc!#xnjP@d@_q#+~ 0 eda+%${f6~Ք,SOjbAI4YUuuUu=$! 
rF8deH}P|>`59)5Wj8H`.0~N֪CT%S!JeCT-nDVQ)'aFʓ֋X*ݘ//`df7d/! =3Z",ȠG9S3g Ǘq_^ MF3p>.( f]y'3"J{g-_"Drq}ažg@8ҭЅ#HP#m&B}VQ K1 , 2psxwū)HS\{եZI87尼N$(?%6#_z ]gj-)l_@Kdzk/IYUx1T{] rԼVTxrTPIE$[DGIjЪJH % wMG=bѮSW]Yf ?\?'`)*?dYvY ^ѣ]o?Ft"|pxTξvjuH!IM5ws۵ssxģa^b)mwu? F㼰x'Q^ NGdlM=ſvY~GXȦ}Tnʆߏ #dιx$ҼǚpTM GЄXQer]<նO~!6PY8: m=@#w%87?ILبN%+|R #h*1̓R8gDt^;J}N0XglTmAK,&j?I?ߙ|PRL{9̔XYxԙ~$?t3E\$"4Ōׁr4$JoBlp\iKBЕfO0挺g3K^SM14)'LRfD. 8xlIkk%tR\{p I'( 0΋\lV;C! qM2А$TB~<.т*Nl[ J(\"=.\F88W)Qcm]TzK㶛I*ꊁk)Yq18qC*Z\=Bik1Ś尶r|Մ]5͇x{w=WWe2~~=#Ղ^?NE/-X&"wPŲg[7_;-UO|vEmã!gSk嘅\|9^śc?-f:<%m%ex(sex|iض?/KOXV}+5 I3bpڍs 햊AI}FvDE_&;n/ZU5!!\D)%qzNDqڛ(='WGoq N(Ŷ.zh/Vk3ų_^T|yqaEaP" iNa3e& T>>(uJk 0dp>(\LHr3L›TJL 1Ffj][=yXakJ{G^JE)*DOPK0 g=^Z-\x;&Q3aTBVm MBK#zK SlP 5mU^hִ-5AT6Bx=2=2 b fZ6qB&|KS(RE莠pn6eHW(פq;})L1Z/,vSr" h c"h7DT0L5%b "362D)GsvօEVyc@BvO C[j]tf~L1i~y }HbQziM;p}2rdžn [GLG9]pHz]$;+.[TU ,**]2h*T 9egv-G 93l f4pIڷ7V *N AtSzPQ}:P5>TŐDuip}3; gt3l=oL58%D`M);T,!86{FĤtubT T:>N1vS;+KK+5!!\DSdRbCQYH9vKĠ$>v;<%;JVn H3$bi^aپ~/-e^Ғ,!}9^ U|%B lmAK%`bn oD5 Kh`tUų54*Y6{\5m=cX(͛^;VRN_x_FW\ϮV8w0cT[}jH:3[T<^wdv m򑿛MƣLFdv8` RR~r#%A 悵2ӣ[v\DQ˃qwQGՂz6/Z(|`[N#`y1'(B7`3O%{Wu#&/OWx 3tCУZ}|y暓tN6-- %@u I&())\@E|+YUՂn^t/ PhDYaaYh^%vld9ibQJL>=pux;ȉk*Y9tC> đ8Yqz_QT1ON'J?n&f>>P/4,e~_5[p/Yv1Տ{۹[x[g3JI.jCs=w5WXkeFol2;οNY3V, ǚG?C\0{>al>];W4Κ5^ЊH*yҬR-&(ZM%EXuU7LkhO di=%bvn.lJswu{vTLu{I_-s2c5ʘQ<!ioNޞ3qH[N FsJyLH+%5In[u cFtdZRj~6j:qR)=Ǔd&L)䷃[t>5r(?z|zxh6ŀŊp߼ssFf||rF҇q>E9-pG"PGzT<&" g^aBEu4*:c, d), >h?ăWhPlEH\Y?B0K>"c,"X."Z4XFI"d.!K*x%hcm&yF %;a> _?_ b-n(z5X̳dMO+ ,|4Ԇh@v?7R6B$>kG; blFa8fʀ^1eV o-5ǯ/c#W/?&öVRxmV*KAUhlqhU*2L  \yÔ vj;B 4&5$ ao5φa wnp4[YC,'2]#YŚI5'VaGEg %GҨ:XrD*d?)4 OF/_%/>d27o[Q^\q/1"גb5s5+rp%vJ`;jU0U3VUD|yhªb̓ dOYV &~!JZ^Hhɬ`nI0J5X˝"G>my]܎ Fh@.f3ik5.(ᤓ`i% jƻ Y  \28hFND>1+ԯTQ ƐTb!6LEFECQǽJHDzk/2x"1| 6ʞy^Tm't 3@g,0qv}X ihogd,2S'8=0 a~z3HDyw"L{bi Fuf0]gfxFV=F44W }ǰc͡ %mE4ਂjnGm:rqޙC<+*8ͳ 4eؽ* ,ڽ;EZjSLL2rPp!c,AWi823h&slb.^_gbif=ai 95>9r17 +%bS.s1rauȁ%J`˰!ZR Vd`r%R?'3(q "9)=Yv{jAș90Jr3 F3fʲ<ÒYW51mheaIͅN 
3T2lfnK[(9}GIkp3#=Oze;Q(ηܨ">?"9_z]TXY晴$!_ChvS jX BD'u Jۮ[1pvkCBrm.Sܻ=hU ;豟"SLD }&uF;mSݖlz37垦l4-oY݆S7n?}v|g}P 廯bm|<:4mFE(YΠ&sShsM2F)JJV[ˁki>ïj#U рub voFKkIeF} Rq& Ν9׎f I*ҌKp|2*}(!ޕ m;[^^߰] .f&țO_=2O-.߸ Ȼϗ^ϋqP+?;H7W b rF?ʙ(b)&͕gq:_|4mB3Y"㣻E[ ;Qy6|{p & \-xlN>(x&,iISIٯ%vI -(ĥ-6Ba 0D|13j uF2۪;Cs|| O|4hE@EȓFjEZuc۔բ'Gxvf)A>Ɵ!BysgT*xݾ]n5 rC?ّt4/Ç 2ˤD7pleYJ y_%<&W-ɇJ q:3:'i<4} Ƕ^6s=z1%SޞuPܳ:"׊x޻VkpF8>XwoR^Mas2aywMF[ kpTjALݤ;`F$:]Wՙ50xGb8nNGyjz/Ǻpnd'5Vs%eʭdHӳ;2V8HP BHpH"R4PFU&~~eWTWJ@S۬yPu9Ţc3h%3zvT\h6UmG* ͛q1` #'gJljJ 9QW ZEy$ RAŒKn0 z F gCWd~o:yzK<ֶcK%{x">?H|bX5{;Ccg3W]uw$<B)$ $GJ%{ޱj_%)ES} 8Z٣E; V YƧA 8z/b_ryAIDGs?^U2@k7cGt,wv믇X ,hTS/+>̝B^/˨e<HyRAy;a CM~Y:jv x;lȶ @P*컌e>\$JSL*{+t>lbD}0A-y[A`4`q@/CDZaBðXB>YM,M;jzD]b"]ljDjV:EPNn|Aiyw&Bjf re.kc8N- ) `~]4Yn) cjAP~~-Q+ ;B@D:+GQ]9??Z1֜[`3+x9b2S&Wd(K)ca1ˉQ2+!ǣEP-U9iplЉ9T1qbw6|Fj)Uh5%ogN$[p{nomv=$ ԲInr@aEADLs4N%%rNHs뜨_x>Fnk9}(:bN.fG-❣'ٵDDh$LTNۤg<+7ڽҋJ;sǽ?3oEbJ^yѹMu)Csߓw~v\il95_^o?ߖ;()zn&v^|,b0 )[)Szb_XzC)de'X|ʲhKhLq5xg[J^ [,!:Ƭi\p-uvkCBrm.SidꞭ9HtJRm=׻`T[_+ɔ"7 @WjRV7YKEU_dN" LgͷAnBHE%knÖNf%3-=!N͐1b!uÉ AUDZns9=vV "3Y!y-c30>IN%Wz y.XTiumgA~yE8gt gv=y#gu3َ"8Dq,  y,'{qL}Γz)Ac3r9093t;wSךz\eȻׯ*7x_ azsnW7ӛ+%~xUqs!~g-kS~gDǧGw'pm yK}Lu$_vpPBh>ݤOTtvM]"C Z #2jPR)2ΓrC g)m J1Au2_Lm"0eB%=ZVH qm+sld[;$7ZTc{n8zCX'=}|Ư>e I7v_~>XtlNBxgh&ז:K>-fϧ~NoN&uƴ7`o{8 gSj!SidDY" Vd+:6Ыn<FkCv.;1d;΅1Pǰ)c<(zHy0 pWm/F*"~Y,J^%X&$#/Z}R*Fjpn;,#/ $\Hvh eo5 Śl/x^6?t~sKwR6? ބr8DЃB;`pEјTTg$<Iafri&:f1,5gE Ø d'65U' ^ tomimr^U(}"B) \RZ~*Du0˒t/,TDT$)$4$EPjml?٢7w*UvVkǖwm͍gCW!5Mmle*}IDKKRŦlfIŢ5 u^~wy$q8mn⒀1f6Zsw\rc߆.iFR򮼓Iմ#W}6*S+kQTygy ;EHl$pgl&i.؟.Xϝ(~6qg4oc̜ȃ )Sd"c LQk3X? K=%1r(W#_VB8RwWTZ|Q~)k}X fD Fd ܭl3l+Ij0$qB;bx# -N٭')j/xD2"E,Jcp)ݛ8s R 7׮&oC'DR)YQqibB0D;Uv xRLn4 vvmXDaѭ]ƏQ#Cp4ABBvHL[3abDe+mfy=+t}mA!]gv6{[HNȎHp{xڤ)Y~@@EC;L>Ι ,u+]`(!V-oFfJ軤Tk%ڏKVE΄<\c8)M84@*'l vR} ObJ>(g|4BO-:]woYK6:_i?z@ߎFm}"`M\p4>[Q_aٕfP?SFX_}&p$c$hɚ;:;5H6ވ5vFy9X64]ꪻ"GNbD-HjqF_eOS Q)Uuv0E P^CtZ8#H3S3lLH"&/ ߧBR2Bfb.`,/X*P>hfju7 Uxhd ge͟dUMP"$5I.5IwE&(C )!I"#R{F)j 䔰;1B) y7Gu:xswSEM.vz b8? 
&fgfWЏ`kN%ER+F"8[c.F624 E(Cs\ЬE1ǹQXE%$=#=#GL='=S[?6VK9A2ˍ7h^GDp;G:BjNO]"d'!7`隋7tig"؋o94a0pդ~RHCLYy0WEDn ’^ja8r_@q9SÿvNPDT =`Q.;ᰦв7i. v@UQ8I\hrbB)vإ:xIIY+svf21`TZ16B;ED?5d*ry@^#lM Q`q#)W`F8[dB"+PGK֐B0\f5cYՍ2#S1`0/a] m, ӆPBjl`1I`* 6 f'\L`Y7VxVf 8UBr]I r{s$&m ^2*hZ$A))oʇ{Ωl`+p̸e(B:0hkW#s 2+`f ˩wbo<i WX5)i^ÿX&3/B|1/?Sڐ]^=p݅Z\:?&LLB>Cޜ>.o:>it׷gCKqHVz=n:[~X![AJ !?|{p}LVa˥E>>XqSa&pӝU{/:QZXh7_.AĘ3XEPDS"#o #]0U]zfɫhc؋SIHj ٪.9Ü'g`` eT9Z,lELP3`]rɫ FZbp-S:A`ZwbxEJ"Gܺ`pa0|E9 { "c*uFZl9cZQ*X(Dy'ڜ)LEgACK@yRIcM{@\sroX%d)( HSrKHL@eUU\]a-`f:}ܚO9QKfAaan5 W @lc,Jg2 B9&DQ2XC'oo>NK4 ?n]pZ-q^wOLP!r4Y|7ō=\F!e^qzhT|:}t+5ƝA._}{/j" _V"h!*yEaL!»+;Jܹ\sEIL;4w8E]aᕹj$HwMaܹ%QBܥӱBT¸cFK .32ohL{. &7n^o ~s"$]wZD=BsMH8{d-V!jgU}םc!a֒Fϡj1^!Pi؇`B=}j1¨,jOhE3sZ}0*~.)՝VF~v!՝'ê a!;={F;ͩ{F$̻{=uZ PACYty:UpHZAmN\ 6a~17/N] INq߭7^i5 d5-3M—6}|;rQ*[ Ӵ xzH mIRt_Ʈ1$ׇYϤ`SmxR=ЗC\ſJᯅ#GU O]w :Cx?ziѪ'.epTF9+xZ)B5ܩK_v(bj\Voktvܽ gw@)#cY$}TRJZdo/Jf󣿾86?d\;T| ٳᵹ,&7S!#׋{Fu(y; 2.'׫JEf PTLy(P*(RIjVt^WFƉDu@251Ln.Jy>.L]5k˶M.UލR&fWWj82#2K I`%k<VFճ %g֝gZ_\w4ipT=pthth<)OJV1vB2xNLAP~q>w Md2_ +^Wy翹Pj$av]5WRxSJPߞӛnSQOU S)_̫1&3秳`B@W9GLǟ~Տ֢Xh8zJ?r/į*f4lj6=} 3 *ի+oƳ72֯7Ør0y'?(۳ {G嘆!;ip0yVP%|armլKa10df5)\0wiR0~`cգ~FJd}Wѵ/{lvqXınK]0P?W[6:e7o=`|Q+IUؒH)B7t97I%fF jD]3Smsq0^3@ >}kh™qIaѼMKs ;xh}W_Vbmk!`c1 y\Do{ʪ`>݌+s_m. =JE"3X\1dmxƪ`v\0ranw-c Y`:w;8zV`Eo}AVJ{̬KbPĺd:2kowr„n@ UI"F01Gנd[ X lz:Th@;?{[]8 gX7 QkQ@Ѵq ΄0pSMbGS:|ûO\Qm`.ğ$V}BkIZһk}ٰЭޟO|&bv6Ry6#nlWv^ksFfyF=py~gu9VUP5 .5(VِG).Ez Nxd=OQʰF9쭺 8 $ 祷`ޙ;P`;Ri HȜ)<7S |yhwm=nY2̪]?; K XwVvb nQJEIm!IwSw\kE8,?5-n+Lϛ˹/ кW{;xIH˿@V .-`P^scrռpC(7ICyc YcA!hƵag=#J*G^{_{ A+X跓׷0ê$&DuOԟhRFc~t}U){q"]\Ľ.6_#8,xVxL vHs Z4Q8`$„Xg8ɿbޔ ݸGg_}XxWxWfO [b*rcf:0=իei~ jl;)7,Q`vS)s5nynBTQ:e)@Uݴ݇-Z{MZuɚA]d@l. 
٣gsVްmv䜄1%HW;w -GS8"ԟi_{cL?NMGRV#$k3OB6M>Oz.oS{|>W|#(ccqeklm[0<`*VY<.FG1e5Zj ܫ[:e)J!E5'bhVóv a)cpRKGg CSH9ܖmTt=wg.@ڶL"26j My6Ü˯9goAN${Agb +"D>]UH<7p"A, !70|zPwTqiŅ.@a, !T+BxnbW֥;R:.%zq)/9P DT AI57<;!d`B>DITLHKl[Kk{Iru3VcX" YP Zat1ZIR`F90DdØՄs۫\@aV 6p^>0pZ(\S^YrjUԼT>gAպ|%ۉ{uYG)}X \gg\;G`2H;}Tp\+&+OB{RY 3"Ls"} 0x;yw);냷:vk`b"~LNsq7ɬcD6 M~THC RTv 7<ʟ Q1KUZGޮt%`ٸQZ]o]e{ёAPn >QWDJőYIHoh9)Aպ0#=qz/@}I8 o'Kbjqgבgq+ՏeB~[4Qi |{n?Zh_䶱<~a@_F})b䴲4Y+mV>O՜ ֞ɖu4WmۀMI&`r1HQ6X[vGS[hN!<=nW.hr1HQ6X vhuBCqmSR T,{!ݻYJet*.P65Ƒ$(лlCguO_LT~ݜ!c-Lˆ]IFQLa"7H0D\2iBFOECSsaV 5- uY:9˙7un2_}(?*IL0ά>z,ngEHJ!l΋k(UI 6$`LxfPP ] 'BY4f@Ud"ZIs&`"Z8Q+M=#Z`q,d1hD5te8@~% =iCi1t kλ>ƅNEl8Ĕ-gߘ6[^a 0(F5~w6_S&PDQP#  DR=fZ/'7D)` [e. sy@mՖ%cE&r'S.SN& 18h.ﺓI!Xe=l}ۦd4;\Ьy; uCs17R;~fnn N0ޚYy3#J,$8i- .n4J2̴z~`Q6߮۰*T19 ϱ(` ǜ1)/ X=1iDwv[,T#}4S0FWro4` ,E $9lhWn& $kr5*kC+*wt7ӹQ93"z37'w9bP'<>L6e6M!c`fm@Bk"CQ cŎXEpn4Z8\ ddufTѧuTɃ6V0 /=J4G2Ė@DAC& MLY3H'RfZ f9.kT 4|Y jw6_Zc'%·Hl^{Zx%#U-m9;u݇ՔW+il.dJ-ﲓic̐ NW9 SDDZWy ,t5[N_$%|1c Q67AU`mod+TKTKz8Z*6a6T\ꤖŝ'o. 6!LRE뤠pTгQxr]j]fiVcgoUs/Ĺh? fe]C9fRtWpH~EawA劳fj4$1k!TIy k2 E"LsaZ#O<ΰ3ELp^Hx 7B2)"tԨiHO f :@DdBSf<( YU5ÐWk>M_iNm@hu(OK݄({'@ x9 Xd[?Ԋ J.fmY͟a6_rpS?Q`p!BZA|QތYb@ A!88 [Jd鱷w'7,ci7{׺skNw;W`wb-[jlU06"Dm ER.yAI<Hǿ&_= RzK_qCN#"LEd7L~+0j9VBI۷'_ V2VUǍH0)c9-|+v"qi|9y-jY?'A;3Ԏ"[VA͎ʑZ L7\ԵR]'m/"k"*+U1>Y{B*i^6CZҞΛHHz1b+̯‡N?LQC.k/f7hԇ3ʊo Ǿ>j{/ӳ%Bh^IdGoY7it{侼 C$}3R2TqpzVOqŎ鑚 m~m~zpѕe:0̝}tr콖f Q68=.,:qiμҚ7dz3~g.@ڶْ*v9ǧ҆F& |G>ش1qU;|DH016܇jytp04RV{Z{G%{Dp^7MDB{TZ3pҧWt H?i EQ8d4ZJ:PϹvJOT%XD#ktrF0s wO&D(xRcYpT\kJ%x=Bϋ5)DV¸Cd7ujP͎_2 MnmȮ,06})WQ_rX>a#kIqV"fašr1נhQPh2";T|_GU!hRϩe,U n/u\>cf(rM Qkp)'9q!ɴ9k1Q)X 3;EE!F^61QF}TH):-G?*LjZx>S~)W?tjRsO[bncTQ2PD=0Jl5,;}( 6 9"zIMw2JC }^T?{?lna~i0=FndcH>3>O'=|:sgl+w&l1ڿoM m?Gz7'{朵bm.ơȩvc~mRÉp Z{J{r <.FW*  3<1׉gpD^3_iF<$b:7K@ h?k+DS;,uG G}8/9\dJƕA:8h~q9mDAcyC)dLK젥;ZҔl<Y&?RwFh;?{RD5BZ+]zWeĜ*zqר ],VKxpNwBR,ÏR"|oJIX?N6ٓ``0@/b`b2K ?̭#oۿgc}g^!OEn/_?oq!2!kup*ULǫE 3*ӿeK8fbgJ1 _U\;v!߸luoHb:m4n\VYۺew4պu!߸f%|v^B!& ;ʐ3TҚP$#^ &Y5ݿhnaW8 ,k{ENpd XLJd 8E}16^_~{f,7o?! 
uJGJLPG1VYcьl2(Ȧuk<28hց#tT:auo6b6"w` '"*.LD*_S-?'H>_|A9;ǖǣVNc.ڑ|Nhee/\pZ),{iV"HX-Z!Egؖ.;F;3o7uTzp $ ¿~o; ӭWϮtO͒cчuW}?3dCUO6~;)"XU6_܄`w1+?֗}K"`^Mai:eJκvHI{?bNx I]mo7+dEsX9wO@6ٶd+d#GcvXc8.>O/zch5-dTU 88mqʜ9g4rXypDc1H F$FNoXt\p_)!F.g镳oB\"^Q> *'Y5t8OxA9ltz$oA[pH9A]HOlF2?;}U'!<"1R%Fͳļ s7؆6Typ1{i֐FUN*Q P ]BP"\(R/U2 Cǂ^^̌ Axf?^5LYpYw";!XTT]hzA+)\hgAsaۨfNX\Ab  ul25BiP: :LPALvḛ>Ysj܇SƎb4Zk'0:CƵmDJ’jB1 1h1W]'T55VJ) Wթ _Or?)Cnh8}YY`o6 D4Y]["l_>˨Yn$z**Y.OPazי>}SlH㤴m0ސ0=h5"xeK-),.OJQ_tx*%Z@aFd !>\=:KQNKgÎٿmQ|,qؒIV8x*ϐk>%zDK(ҘÖeP*t@Ahh%ypjAqo#fTXi9AcZ%lIE 닑[Ipmal %i.&Kvj-~DZ rd jQTr)v:V*UT*CVTF7yGѭש}FIt;Vq:U747!E) fo翦ԧӟ/i?CKg{ԌV8X{ܑA1XɰY^֏1( ԇ`6c8 d# aaL&M+ㆍw7ún==5q`_FF{fK0$\`;70SanU#%C>jѬ@dS!bN=񩨷TSGd&ҐjNhdݧJl7ZD !zT`:0ԣ lR1>P/ޟBQ^K2?<~V.eM5p1KJMֿg.݇6sYkqT2'LI.Iwhn>ͬ ^("BK +`&~đ<0-+.lmԞws_wKK/Xl "y-?%+_Gk 4@$T8|ђi3&3D+g|8pʝvFnfƃr iŧ*E΢DA}RcY;!9ؿ?tRO[-PHjJ< L)ee\OTWZbzY}2#J~<9)a0:345 g;HtQ';jxOFZLB!yM"R>yJGշzVP /6N2o= _x칅8 fLy9QYbp7/ǞJ<#4y")>v^[ M$4f>/qQI*:ؔlŧOY\xTVI›~ܫţ_|bϮi,irm6*Ȗh@S ÝlыJc$RIOA(66! Yd*N'oѸ$F:Z>8hA5<2t>fqCѹkn[-Z˹jc愵̓v!F?!yۊ7NQ ˦> [oVr͂BzU0*!#bd sТg LZ!xt&z>jX-ـu,5h -R!5uzT0</5۪p\SM@'ЗY.{n8Ԧ6AS!pPͽӚ&nST{ WǬPpl Iɣ#5"'oos, >~͢e4t_RȺt^5qSq+NznݟZQbnIAX 8r+54i[_cF }8è<5-Ji`+w[j dzeghZQbɴڼx$w9IVkݸ~<,%1[~JԃU:J +[heS K]6ue7٧(\?}ԏzw׏$'všn ͍n}x3w*"3 9T;bt ujQF/v 2?N;]te;-7L6ظXA",p^fXԴZP erشR{ͱ~ x2BKNj/jb;+5W~1VB|v[P v#"@L7ޑope%e,G 4Bׇ+A6|Fgi#m$zy|kM`-S#x4޶$|'430bU@af)azAA&p1"np^Rl-4QY&qM(xCBHHug 0Iø">Exbc'Z .`dvgZ#r hվbhyRh)1hP?ѽNxeVh'ˑ ,"q-DiJJV 0'q:h)c'4 %*k0|UcUc#l}qP!<-[CKd%^ƧiUGI_iSneuHk}Σ$T%Cz 2eNփuFc:Yr8K Xٴ#S.T8іSa4 ·P[:ok@b0cWa.lTbZ(lT>W Dj gv$N3d`{wn.n='4HqR޿7.u -8Irt̞atXd#n;L'S Y8|jޞ Z$\s.׍"1J/L]h\ͺ%W'`9ߩ.<?'n֦XQ7^^'5D:{G2>L*v2h୥C9%8 !y WB7ZTJX!~MXv pӀ O76=aDDo-Ji]ٻFndW 6N?ef,A&}8 6 %e(Ғ-[j "*֕,zWr 3L7{;Mlp@m|%J{8[H7ھ"Еf}EXh>i"8~&)=_p4J.3Ά MAl8PD͉J~4u$ Amv3.fciF$(!oM;Cm&r>7iA,]PJrC06 >02ޛ3VoΧ=ar)2B+(_ ?I Y(('q`/Lb˓ BS5٫]$0K^:raE*Gw*/ 0|(uWk1 `1+#i) ia&T=ei<1u(_Wx`Ю5Պ E R;I*x]/$aYEI WVf'ux !̓%D.Tfr2,.)O|1_]%2f+|L@6 āpEU0|t+75s${`5US;sVtl;5g27$@.=X>3&'F>%'` 
O_5RcL0LitJdʁ,;<}ya~}Y9hղtg5C[%٠4dWbZW L.Wan( 0;]"w{j7m?I׎yܖgh-e"k-]E ǴqP_f\dN mC[&2%DjL^>)܅M`+yW}e:_az1bI8י1=Y~n?P3AC2bJՉJ}Inal?&UmIF ]:ϽMft}mmBAx6CՄ !(>jrZ 5^JJ 8-mÿlo/Brl{vxsJՆYLޢ-m/Y2ۓY -4hBm\&]`{l{f.\27<πƶǠМy2O4o=jXQ%J!ezЍdK Z ăzc8dm [Ndo|МёٕߙPV<檍JsuҾ=FkaWמf6VQctG{yR̳ӒMc2ዐ k\g6/s${sJ8ZM E{ɳ ʐsAǘ { ҘRm[%2ȃ'J#^6n|pǙh!R)Z:7h@*G;9n4+vF݅-`Jvy3FmLR]]рBeׇp=ȅD/lX1Lxk])mjڹzA^ #|#IϷ>\Hz}gok=d(#k.Il0M4R5JdueCXI*}bpA/v.FSG}pJF ).NI=]N\x1l y'|<:crp]_gFrz*𛖀qNI3:8+h!f.}/>%ܠLjpR0h|oFm':l$;ŕGa1:3_9_Ҥ'ķ|{5ڦۄa|aݦ#!+$e!Je]m8z?9dh@ t;_@PG=˾{GCg gD OUr"LJ@^Q41]1Rtx<888<(RCknf%4V)h>=]vV钶o/|ѕ,{Ϸ?|sz'w:Y|ݳ|3>Mڲ1vˈ?y+c!@ wAV+g-JqcTRk7zSzuUR.K3/#N`EۀUx>(`ͩ:x'm(*ǬI\K 8)shp>Sgp>ZƑ;HbJ[ 4eA3A)/GMXQ ((1|%mIcʪ ^0T{"C)- qx;D0w疕5J!6XbJ]E.qg꾞o;/_pet |}sl17a/+G|0;Fݛn//1~Jş!?\dv;~NH`m U܀)"-(irtxJǪAd#t{W oW䣨;?Mf7Qst^>MR%UH;tބd6]8w_ 4v`r K6 0Qg3 4~v2sȝƧC|۩/CQB4$tto"@ ^{/X2b'JY/y2Z5d)$/quP``Ce]j0FT0hm$+r;VF^R# 55/C j1&UQEA#Q3O=(r[CfZE,;F bmmpݞEڒGW/z̥w~ߜ%_i_&}8#- GRNͤ,=Al#=Ɣ~FU{6% H)߉8?C)48V *D'H!qCɭ,km_i@E˯:,LP.%Vre*++ z`\ȭ#41Z+Q;4x)  9BW0 ™b@rRP_8" ;Г[&%;QY?!1XB\0i&) F blfieק}Zonbg™C ĭ3%UQ*st봎>\ǭr,]Sk1>9댆@btr+rc C7<0Bp(aD!XժVX)dŜ9\ Fl=0xlǷa3[8:߄fƔ8=*F=  c̑<1iuYQrj5*5š1; pw5] jccM)CV+r'} 'e&t8o&T'i aԂ$?e ୏g33 ?#UR==fDVrW'˻BܸF- 7M3 qSBYL}?*MZh"nP ث$qШTcqGjP&- zZb#$A5Z9CR%{iWXcBw^UPi͚@#hb  ^jP)DXj ԃ=W ZP%(FlBw ٕPUlFu0\eƊA % v^vUK]>BJ{]H%? X\C* .2TH EM̺% -i2QfݪA[H"f~1|*3]:NҦ:0>uPp=jI $' *;OtaryVv:<6ӝvUn}~I#0$=sr x y{rRY;gEϹ5mBFث^{ތL_329p^چF'#]4n !0pxp\w07PHf':޹n@$Rl2{Wbu~>ܺ+JT]mvѕTN"/okFY[ڞ%182jD @ qy6(Vf PI* 02},cKt^ QFpEsB.xZX۫Y/'͓.,d,d^$c]$B6݌H%T6uz[٦U)ä=ͨ@MoHʹ?~5@L!M!@xK7o?Ounf9-SOm-N{я:ud]~L@eh9/{yO(R-VA-?9'~ICa8$=jjUVAîYh86,>Sϻgw̭EǻG#1~^kX O1It6~x;SW\،Eg9n/g-0Gb>(Q}BNFk]lYG_#A a߾$q>_[g;Ljnfߴú8}X/ۼ dC̽>,/#Pw=M"$ &. oVMipvqgQ76Kոe`)0Q@KW@|6c pN4@oi7d+nlOV9xLj6n{-{ײdt`>qzRֳBBtP{6*?h 5s rNtO<n\Y?&—Oߧz*SSK2v -v`-?9'~O- ʆȧU<֞ֈ/aj`\>!/{ͻ&r a0iQec;ggsɏfmJ[ok{jtk62n _m+vp1S*k85E: MsL %ڐsBVvɒ*Vkm#G/9%|? 
^d >0kǴQŠXV8ɟ\,OI?tJG=eWhA"y6?6?ɳ.{;xG;]Y^ ξ p>!D}Y !Rz^U V \cneyf# m(Jc0U`< ,j !Nnj%v˛Ձqvs5YE_N؂h=.e[,@_OfS0w𻻹K3 ջz8Bb 'kv'}<0R&̫?]́f4*#mA0lӿ *z eCuaQcrHBs-):5ykuӸV[)9SGƊ4wuBBs=X w]Sj\kQ?*#O#S]Ny2|;`5Z>UVV9RRd2Įَd~x=:ep^CΉh( yT(sEN>9o\Kv-mx$+p\1BLa #K#)is-# }r!pI<1}8/8\_%]V1e8u%CeeIp=&$cp"WƘ6,q1C21- &U[BKPB T^=*rrF H3ccxT*B]S%Jh P;ofTźY1u-ޝ_kyy0ZL|\lE0/gi :_{u?y֍Hٰt7D෫cU[v\ƃ| #whYfֶUězExs40vOwE^27c|AN:7(g x`F5ౌx$&cL:.C(A[EK G͹_Vv`Is .'^ _:|t2}e YHEUյO?1&Hq}D=-7̳5 Kx(pXЌV4.~aʙP&R.ÛP-6߾?zsI;Yƛֈv-;k (0i&4ga7x[șCG>*併S2-@̼tS`K#ECsO;,5G*49DK9w`G>=zBhR ؋iF865*0瘨b3o,95zvd&+PjO׿]'uhXurlqsR9HM[T)9 -{Gw C|g,P>l[zT VH3+F 3+F 5Kٺfi0OɮyT+ ĸ2:KnWa1p۳ y~ҋHO0mSIٔQ& 9 m/dg*ԒJmڅd&a$" $f$b$4G&4,-e&8'Pkحr;+G@+~*[ŐQ8ff7 UFhG:yaCԳOԋ9^$ꚉӂ+h@yր 8r a* (bx*RzJA " Q\~GwW=JB Dţ)~w82~ OcDiwiG;&Ob 8 O䴰ϠI (1ӆ)40sAha|h`6̿8vK$+ f2Bt^o1Vg0k,LbdD3&T/a G1 eQ/Q2 R<Yx(YI= no$l7й^A Ŵﮀ\J!E;tmq_bɈjyo_B 4>s'itHK̄!p GmBрWE`> W-j|1C^wQrБWS1CxdX]}q6_}xIvۓ_bӍr5g5TzNBmأ)ù}.n_˧W 3W z8kIFWn_8p=|d3GB U/5Ц^pz)-%s:(P4$)PԆH> ۚm&pk'Bi5.nu0ݱ'\*uI\pXͧ^;R=56FXTJsMkXNv~(N2K两1MϠ>N1{#o>a]ph,95 \!pIe_c;ZO: \GTz+8‡TCw.^N...o@h#ފ,!xl6H]{ZMo.:MZS4jP0ɶf)@74SrmLd-aҀJq E!˥C .6YqK fI ___[Y1P*\6ہATXhejKQ{^*NUfq/\e<)łzL1 #-U XP fNb! 
Aqkv:3Ddh6*, Zi<53VQb`> Dԓ;`>+o|'G|Б"4yDj=l|RCCz,jApDS%#gh F6fmD3;{!xbS]~7N: )ƍD cq,@j:6ѱ~ bYNE6 Ύ>ƻh k 7.vD+sZEw>F]3 l EyY"N8-=$hu_4}s.:fVg_ X-#xNDP̔FGWǿi ej7y#ML!PU & οKtFXPǥu^bh$FSk(λQ>lH;%G`*]+O9"5MN5o )e=$J6N"dK8e%$*}[\j(<@ .pJ>K0w7}vPV>)Fj=O>Bb.w`qy*zz&ᡊe.gKy3Rt^5K_rq[xV)"z_ɧ_ޜMlX wUw_0J )̘lri拟VQHO%ӄR]fy8EXq ܬpw} m5D DB$iF64*0!2&61ryyן*G®rmOD>/Sik4 ITvZRu򓹸\*O<'Uq9V{ܿߥ> M*Ŵ6b~yש)DZy2ϓy󺖹j.nDF§n ?n5AK%F-c+ǫɻ즊J;z5Xmg YPUŗ8'ԅ/|S>(Z`ŔɎcs `{gR+LXFD& /RZZ4i}rEOE:QRܹA+7[:"Z s,0Kh(Ɍ9GC4!|\"QiR܋sQ섰:hAcAxe@ >j `}"LB(>!JW>muJ`v?^gЫN<;&6A2ԖA;N džF !~V 0җ.QaLwwoGT( S*4{Uh BҲ~|*ZYV\ߥ#!J3IGHSvW:vO'qF1h@ŐPј;s68CS1BC@PDϥJZ Xá2jҶs`Ėֳf̙\\41;|p +II!",0 WZa-n23d}̜6wYon97gw-Gr`;fh@{/IOi{Czf^s]79fL%baL][o#+_v uyJr}8AfKzYIIrEԲZc!Ab-U.[UL;EwQ㬡G/Ѹ =D[׷z nvG RM`|(p;O<0-X%*n-C+0C[O9Nm\[뮏-jN?=xIOtQ8eѽ7I@w*.ܛhHsBaTLbwjᵶ%,_\9[D,á= 9 Hk u~+"JSxj<0tCCf#SJݺ[?s;X[)͎AJȶ^on' @aW aҞk>gsZa;[k(̽X!xwאw%劊n>T7e|TO2w!'2K{Ĺ?ϟ<;4!z | Ȳ^~9 |qz,Wn>Ǐ *8bXRVz@ RknnWʯ.4%ّ5j@5.K]v1BtY @oTha`09 * ء)zu0 FDfEv6uFREV)9H9_N+^o,VV)Au7M#I^ҋk)aLKSbMb@)[kêU Jĩ#Gz8xVyM9X7˗EcdXF4GEN뀚}WEoձien-\Ayz5u٣\utCH_ $ͽZ֎`=SC8'Թ鍈|. 
כi#ezj(y94a70@LN=kw vQ=GMU׆{N^,lonBҽsMoO:ïwᦉPoo{G-fSEFU-)KJK.Kl>mFk 80ETTɁHP''(wF!4TIf-Ec:×NJZt ˙"L DQ2$QpM 2x$*-ɄYrρxl R3֨O -&)UMe,y$4bTzޒTD:Vyn=(KZtga[]]&].@`o'^0: α$0yF9 3FjI Cpz_+cˏ4>YfF1o٣@ac[ǡHPJK%s[@ NyG:47$Bw `w,v,XLGGF@&2E^Dg graH+ AA9T g`G' mOx8) 8\𷹭KG~r^7T̟{~q{^z#Uq2Go@^({l[Fלӎ_v~F !(TkC2( 6ۊ,O4iCc_fxK&+B7;[\_2&2.s8vʒfVV h&d6hSm>n JX0'y-vŧX(8%)Z7W}Ka7(Pv؍TvT(ׂkl7O84jM1vcf@3{ϟ9w.dۋUA /osqJtZо$D=~ Hc:ru7h/b} eb 2RJC:(,>MSs4]^H#M!e?\t~ݦfyE*- l~/^xu YL0u zq,NqX$&0=][pYgO7+Ei+ 5={JV!}:jDu 7epN|w.xF^<9ݣygV@<{gcMpϡsSУ?DV9tލ(ՆQ 􆔖U :%#vTeVeVeVeUjWO &$z+4J#yipɁ2Yt*Bjvoy}}>Ho4\ O ^$\KOekCJ|ķTHV_d[pЊ=PBpNQROy:p0G/&!;\SFAIw/jyAod?\Uy[W+9%wt[BX^a;`,s\2ר1/&oRjUmz8njLԧSLKSbMb@vz-\g7} c~Dgh5A{pOA4BiOp݃'iQ8~^ 6et lk*Za<8u @a/qqw|w%ˮn?).qљ] U32ôx @=ﵞz5Bqګ{h TOY4+εK#DQ A  I!h2"Ew˰ٲ}+S6P<72FZOVP)bNST]eNdp2EaI;' SZ1 x8uL潰ch}MPf Le7ox6ohAꗍ4T^+dAq}`^wЕFEFtԈƃ_CNeo-bi{AN ,$ԂpSz8dk$-j) "$oT|LGBWхdq$$ٞ݇Fr jr &ôֻDpSƈ9FA-,m <&CA΀\K*Lm?o>-2^7 $`+JT"= ڜ^y}&Ga~cJcV=n2cDZ; WB\sU[`yH/VaKyc;1Z2ZTm G;M}hGH>^IYl=e7ٻ6$r}XbqMo_xHI1W=Cǐ==Z 5.O}=aq2+~_Wߌ>IO4C|E\DӠ^oؘxbDb;uBsr.X ZSSTE4FyQ :|Wxpy:1? I[BkӝdZT®mj]7aT2Y<)&/CJDš}KWOvϵ:$j<$VÝs(]3s| y W/5 yӽ&"B#בLzx Lnpx ~'8}@3o'x}m'[+H5((N`khE \މ]͵ǫTmַrvn:At7hz}% U& Ԃjz9׹A&@McIlD (bsb24_/‚&4:|ke[EPP:ʇj]ʼ6s$]# lw Isaw _YJvz,3x,S-,P#Hߧ S)y|+gbm =ŬTئge>PE!rCVҥX\$ċ BaA^x1!1^]rT`n NRE~8]X-|'>mٔPW#Jn6>d0Zz`KH#ak$BIMM>)Ʒ6ںl2mwOC@ehC񖺫XKQvJ HbU>nSEedxP W3è$],1fd[e@u9?"jk*2HfG6{lujkm95?offi`W/jnѿxc`kn]HfwrBo}\CIb.B!6z;3)BUo 6?ܲHB^zTPڍN|1p9hStgͿPv !!/\Dda7#C& /.;G߱T2_n[ y"%Sޟ>n|5G拁QG#tm_9Q.eJ 27?Zgd_-SVpqs:fYƱd*EGa}r Te]X%Np5X5e?RhђZE\A5ZZ:_1n sj6CE.OÔ)GS]5qS!8p0y@:ʃZ(u6DH$QK1/&%x_&ԯ>J?^͑=ɹ~=.ڒ1j/><Jyj3RKh^0{yEv//aP}u?$FBۅ Cా S)BoeQk*KU"@jO\DjJpQjw$_#QG]dťoai0(δ64KI4E4,-!#fc-nf= 7%X8FDc|Y2)GiRX |^J4]ZpgRaC s3T~^ ԊW0Qk5^1)5-]{׈ƫ=zgDjX/G]OT *xkIX Xj +^`(5RS_;G{:XqWq .AX]^ɑe"-Dθ"v;˨1d,J":X+yX$V%qZ@[ QopS1*6 -Nܩ .'\#k!A n*JD(@Xt_0!9zye\ILd=dkQF[ܽDPJJ*tiQ~aEY8J! 
A g '$ձT&1 :rXsM2$a?}z75y0wKEQ,k}d:Ɇ}QbF0,7QۿZ*F,8g^x8/^n)Ys`B& uhdCuǩ*ֺyY`EIZ*$]Wl&I]<$5Q#JQ)um/aMjk>G~e7hyKr:Iql+d ??3;{&{gdg1Gi-IS:*R$ Ii~c#0A}.(x(!G b1ϛQ;ћч|1F<-l9G{&ͅV1R cY,LiiOҧ9r|[rl!|ɤQ̇x ;&,W`1"rk"oӿ^jmT𔫹ǥ~{pWLS |5Fշ&'ƓZsM!i& Xi 0R4K}$JyN(31AOxcŽHk$* DN`5R+|zkA΅PrziXNd1e /rZOh:Ð\؝bA ]!vª¢-Nnݺî 24PQB&}+k&=.V#*`cI$bJ$ϐQ3 Pgma[)yϟw.kͻ.Qj0kUkι7ScVu:W@wrHc=8if}h:?_cW2mqZgCľ?Z=-ߎ=yq .L3fZ<&Eҏ=-v;|=Ld4MAŭXxc`{]sszcOf>Y]<-UKB6ݥhPJxw$?f}}pTHOIhV%VW I Q҈?n .H|1p9hb+"mk7un!$䅋hs"DP2x1޳ר\J%vkԀ+Jǩh`70-_ D[LiU-i/2uCgϻiS.#TesǗׄZ$U]K;өj-HȜrv0޲K&%LoКK@LNWͫEĔ3Oթ"|t>u嘡ְ?ܛV:[wRzՎP=vNF:唫9d睳ySL*^4,4UxX)yud\; s1U*83^{s.cF*9bA Gg! ]`ty#xsk֛}{JV1QtmLxDNLC,A[c ϜϤn'xԄ*.Vl@M(k"rɤ*LkC) LgQ J"X%iԂ Vx^ .Uv;)V Мj*(.`ks,Ad)Pm Sf`0][HߧH=DJJ]Fll6=xd%%XjJZe5M+zPdl6~:yL. GgF1N""t$C|%BN1<ЈcQ# ;:KGe9#jPFM(hq#(Z"Pn9F9ؚaWcKPu=ΰ:c$hiv̭gUhU?)5K_4k՜\__r3񚺳q#(Lt9+RBn) rnWԞJy2ɒT%Oݘoo'/e(%vN2cR\ "QiAܮFz9A pX BįOq7 |;itH ҭ@*t+NĿ9} 5YE\(ٷȧ K5}lɪ߆t5_ `e3do-:؀OT]}bI;T Q=)0SLIZ$;lAJ=OȚ*fi} AxyƐ!W)1DL5B()ja˅#m&1^Xϛe{)ׄC _ b9XOF 8!dHk.I]U]&, aU> ۵]HY'grsnT;6όc&v – v5楅݉l(1cGibF7[Ņ%9)hV`^,3EsMeRXUp+3[Y m5W~nV3XS)Ed B۬,JJVZY4Ea5 a%"baI X|DYL7|rWE3yiXJK vW{fAY0j$J1MsJs%s [kAlN-fZI7 $LrxxR$(FW K|VGolL\e~uv%G_X>{{SW29:k¯5 j4>\i"uw< wvf#]7]T >s!b+^PT {o&{AyogM>NAhyƐ!גc8xK{*'̈\D]DYM1h*ܕ(-|IA{ ajCW>wHԀD'iaꏑ]05`!%BaUSLKԀ[dCֶ2e;η ftXue_ͥ3YY=TVnRZ4-1Y SB2I ܒe(#M9j ZJx4*_|j}QNhå 6U@ 3 P0Q7N-K7xGJEO?R=8*vLhQhT0ɉ TɉiQRDt.$ KH$P~&-BFgܙ!'N'1Nx;ń#ՠa(~C0B )$4V&ֺ0D)u{o: QJrVZS1l o)bS< K [<8]Fdlɯag %`E|r/;yI.3(a1̠̈_y[@w]rTirdD/j[ "w- U]W=є~PKDMFtJBM(tIIp ׶ @W&~*2b2 f4$U+, F{;[[ؙX|vo g/͋[X#d6x*1:p]:zR鮲ξY8OL\~afݐh)5aGI[,!S'MFS՟t? n)f QsJvNϟ+A`M;J: =!J뎝nb i2Yf)Ǩg\#)kׇUrھ)z;ϡ|O%{q0Z$q`3#Z>Tqab{>MCNMuA-fYAHoĒ`Yw_ Vϓ Y% = 'q'Mnh]k󗬶IT-_.AtZt}u@~&y30?zWjbBsq[퓞 H+)QC" ҊR%kC>"IGs*>ՙY} ^]M1ս@. 
Ou[MfmxU>/G+{jJ,8['Uauv0ޝ}s6w6+G_i)'V b_XBՀ*w0 a^Q'^[' :p#Kp/=Y51"c F~x}lC :o7B&CsDt ;khsK4RL=/a I` E3o±vևpT@%ϭz1R5,j XHk &P)p(cD-XJ=]kA|+1qZ0P._"3y/.H9YA`00rh["5+Q`"` C4&,944)X3v$mM289Er(>pÝx[)fHz|w;|q>^My`.K%%R`&R6,ȩ5nz3xHxϥ0p+Sgz==GH>+-لfI s}g|k1Š/ Kh*  wvNfñbcԆ.L-{]GFVh;B`G )FCƜ3[Q@AC.:PԄ[Ngh#%Y&Cz_=CWhԓgh5IpO]Z-zȎz6bdr`,l3Oڄ[Sĭ= ^w.#g׈b0[H[ <+ >X^#!qpC"=2ꐲ۫!8k b)0px~Nѡ8y+^=&ÎѶo;EgMK2,c^]j/D'+znMXwD!8Tݱ*NEnm+p&VPy1:CO=XkP g]1h4ZTsq8S^2[P[Jn . 6إh Ji‰fQhI֏fr 2(D5BOS^ 4Ks#OcmthmF6)A>/ʂ [)% BI-aYsjye%+q)JKSbAT$\=T&)-[DHOTuV37|XH-ҍ|}RF*1f SL7T3G/4hGx)eL7S{&!/|F)>-v[ t d>t;<I'- yS4jNy[mA5:H=z: ~ۡf,e᛻4^d2V܆]on]*|Qլ<)N=^b ~lr=lcdAZ1%JG!O?o b- {J͸CvX$Y>(g5|ZVB 36)AŦ;BRR>)W`ĕ7%,oY42zf@݃S| ;X )<8s[/1Zbh#"ɛc|m^H_ւkIrµk4&apD垆PY |C@jnX B|hV;3jn3OsʕTk:庘NCX+Q2J1󂂕`2Qb {ѯi zIk!wߴ) P^@d]+NԎ@sPK$Wdŗ Ϯ` 2)'+U괋f:f'wiy#Yw?]YPQWoǹm ܛf wWqwwWqw;wv>4wة+do라x V%NIrĠ%r͒{uP+ayO6+؊_f 0S)zrK3†Z^("*ѵVn9^e-Pk&'&{'ϛД~}\ܟ|rwmH_ew0GQ2v1y}&ٶ'HrGʲݲ1-6IƲ+V)a.w4\iPJod%oo$ I$p'UHdp"1UR9[B ( hFy:weCJb]lM.vN e< SY}6C(tu+$nIrNwܸW]P Y,*]DOˈ !6\K(@+$ݶ ݶ?]t 6I<(WМŽv{%@ ϵ-/dZEa%MӽSEpExJb^#JaJFa +U2JTBԴxEF-Ä94U&gX} :cF~ uwV2ʻrX )pB35u};ivxkkITUSп:I3rY6bh*qHOO68w|EMGX~Vz~ {qrE('U{&(rh;23Owvۺq\l>{!ŻYԸ3E6ptc!;>xBd\H~B^ V.=W'0RϽQ"btrfӮԄ |(7&SṉQQﶖ$ ="|w4Yh-0GwG׳`˳?vN )6J )Ash'i :|2_n~#6tⱉ6EYЊ}ף>Mǻ_n^* 1DptV JZݝiVܕV9,˛Zyzyua9jF9Q%Mէ먪WBۜ,{uǣQ9ݿ:e7"SQz5BS LּL&{xS$!#E uơ/X#Z>|Z!:@;UrT9iUc VΧ֢ JylZwwtN~£qR2q)OiQ>nD6KDx[p>tIYxnA\u$e +6]"H7 w& w&v~nHޢ Pӳ6xV)^Z)!8%pGxA,HХ-wRn^q^VmsۙnЌ ?nrϯIڿWyXm?d!KU=Qzǽ" VY$0e Z9 DC3.)?| YeӷG ox6FG0F{BgO(ɫSJA'L7%!8/TLP{ *,1y24Ctt%-m| \~ɿ`]9*SClb?(jyG<Ӻ(k[.ڟN@YQѼ@A %j#u4gD(julNpaM[ˋOą锑vuh^W}/ӆ2: "Q9^=rϖvJEVmdLJSږQl͙64pȮo'Ju;=P+)r@.0kߎ"zhܯ]D1꺘~Βg1i,k/ zBydCG|W|. 
`%AXuѓb' ƸYtbTkj[fFN wܖ-O:k;9tpG@%9k,df"qJ F!eVp@̒fMgi!#pN>CPy.;'p$l+D8z~lbn1MG/o-[Qoy+=24 Q&V&)#U?jE\SRj4:Th\Y4Ğ&+y?zE6V˗⩈.ekne{EN qdBqRPٻ(~!E PA@S^vj;`B!q'+e>n8QN|AY89J 'bD&rMl@#|h}I=9=;3(>.,@}hJ34H8-͠]k|l9(H:%qT[59؄wߠF ^+Q'd HCͭΔ:#Rf dڶX0Jen2ԵV,#A^ͣvwtVb'ObWX9[Z3BnG3dʊmidB dtFf!{Vܱ;f]76|Y6C/mgj.i\4E+بVԃoTTJ==v%I ݁J+Xޤr3ْZn)AB6mFk'_TYK DKg 6z=ٕѝehuLaW|s*fӬ7>כ7WuX,çdKKeY+xu[~8fbpa_n,F?vwǏoz3w)Dؔ,k՛O=gƲSkdI]W0%.K<)Y7JR_pL^2{ɔo{R\F * YɬNJ㣘m IE2+eh\+Ӱ~24l+L6nlr#c~:N#0fsmKػfsmW^dԪIu,#gvyUg9v?u}>|z;z*г~Yy~RlW]Χ+&} sh *3 Jy>GVF>ne+_ݭJs_N޾pw#ƪ&MyN`hV DKܴ'F) 7Լig*&4;YG㟃azj^T{Et.OXL;gw\mg>vc h`8݀v Qxk~©nzODMpk6L/˰l ۗI:E"yR{"=++[| -+sWo^kLT1pϻlSW3~9/uӪP,*2ZUpϵIC}^K*PېúI_wgknFn6ӫD$ *m"Oū'?;-uz|iu{|uFJԠMèOVm/wڷl]U|OrN0c]`T=\{"cfS7~8D:;'6V?1>?Ȧg{ˣOfn Uڍ̬8@p*](H?ʋRI۶_SH:\کۄ`U;쟘K9rÚw=:j brtt6*D[~~:DZ'L'+F8q*[@tb48|w4}N+I"RV)RuwDf"EBs"RLpIG'+Q}c\۟YkrB+VWgE<ҽ I ,^M>]{b=ze< ֘L7%zY]qljgϰ_Qa[H>`GV=zbsGn1&hO8u6xK;<`8?2M"Okn܍<.Ie0l'%iٿrgCJdʮ\Pir l︿,=%iͫY$O1jp*T{0t䤡xѯm; ͪ~9B Q!2\5GtBR!EgPO`H0hwZ#cA輂gΥNĹSZAvm( ȭq8+^ 5|AW<Bo^]2|6YW6XK͆OMH`]0  ܩݠaOs% RlD)9/م*"D:5|LO7sm6`7̖#yе+ jxwxly}Fl~MFGm5c%?UQ /{b9^\ȱ(W/FR%>~_Uo-medv}]Ys9+ A =̺w3/*PMlLD"`q>-J|Hdd!Φ͠3MG+/d#? t jȓ4y*PCsow+CR1jah&-ֆzָc[uFu}O{-t~C|:c,0[Bv>tQ(If @nL?to:+`ZGu[􆾗 2 [y#>լ RzC1r=)VΛD^l9&ɇFl"K*zag] 3VibxApnxInH|roVXACƐ2jIr]W4%+={w|Ե^9pⴀԂ *hoXzoO1_5FQ d.cpy( ݂PVq8o~VԆ6|gҟ(wpOu r|폧p2 :z`_xkڳK KP!{Ӑk';TKv͐kۣ a4ֳqAϽ||&^suV.UB,& '^G|ju}VO+ʕ:?.lQI\nn҅T%uє[;t*%T9.4EBczrWUɸXGU!;EUǼnJwd6@T3!bd\qKkaT@l*,L@,p8͗x_ʶߘQǏaV{+)M9%DкE];YFV1"]?<\BRE8U҅00S_wʉ,Yu̝զzo77 ?DBBQܳBQWB}sʬB}Ej͈lY~4WǕx)He"$ Ίҗ'A(]0͕ ?j.%]$Q[Y' =-GY"e(_2 &b/V0󺐿hS+bNՁ7TZ:q2}Y37_Ģ"tF _za{ub D(JoMޤ)1łrQ SaMן 2 8u0E×Q`ȕ uM%Қ{~*wudRU/at$¿\Ge}GC +:Tm<_{zM|)gq͟3]U"$tclD~xwr~] ȃ|M+6S[ + j<6ZGK-8F5%jQIx,;6{;Uq-[G%AۦکWӋO'ū}v)oW9LYk4 HU(GnF$fjE2E<bs󛙦~ʺޠN,U_M^2 >ew! 
]ȼ?X0'&JrM-OY0yh`X8RH81Pr?Coepߠt^ì]/YE7wYyC tk;.{5ٙ޻ڡ}[G[.g0 #𲥎y*%\g @ʭ#]kJj;+g foy5BP2B"g-F94XR1Xe(q*} u≐Er)Oj,,_LV;KHM8j0.lbeb ae2U0I:rp03鼠!@VVԴME .8;Ua{nӄq/pHNQ #X E*%w$j 5U%,_ߵRET:Z~&>4F6zEؾޔ`) ȗ#C.#&Y mwLk')gU>to6S~7w!e~QsCnM_Ap8a|XR;pql./uOs䘽!ٴS>.[_ԟ4nelWu~{(%'ne11(NNal̻hwB^)}w(+f%:6ȧJ1ǬN8ʉRC\(1"H1&JHM(}WRR;YbmQ!5 -N!LLR"$ P0De \ܘ.rW7YwN$w5 g\Et4m0ʇw3uov2a\n[j vw!3c$s;pf2z|pӓE9S؟ҁΕjPeƔ4H<O9ۂ l9u[өժDK!cUkTx9G}>2ۗG8m%}iIcխb!5mKI}ATX_Fר^Vjxˋ-pUIIPqKvHZFB& 3;lOS˹0^pLjQ܈@N, JAՌIv4|{ ݇3f5›vms^њEuq&ӺS:p*"Wq#eltowӱ >X$c,ZwlF&E6kn'ȂWr5ySnL9Bsp`own$м6% Tݝ~-UI|sHWI<]wUYW)ISf4AVzm҄D%XI9d0'p0࿮nܧl{PqCg(LH;6cޡ R>-x^]#;RhS!&5F8%(u C_BaFzĺx!JcT9W^}Hz!Շ^5+'Jһ2s!-)Q\q Kƒ6Aʉ/U9+.qjBRT;ySŪwm97TF(.<}_ߵd`̀Wz+1}9Bm_eUj[x:irQ4^XKmzimh4XPB'od67oY?$vѐN%)'C5%wTߥ7y<_ en_L2o//Z;8_n| ]⎊u@}߉>ZTuUi`ֿ0]}HiJ셔eJ:rLۗ 3.4tըЇ2ak ijEEI/ rş>_k1ϟՃE&Z}q7{G`b~7,%3]:?k[tl#c~V{+o~_.qd[zs&|\#Z,)]5n ŗP~k{0'Q) Co1K6ʱ:n:E3-XpciΙVrϡrQhsT6~lJRS< -}U5c@|ғRyZ|.0LwRIºR`ghePP1\j#B;tɍ7İPR6ՔIuu{bRʻ-&es)-y3;p3w%Մ8 R-ţRyZZS8m-N@aBKAiiE5a+uSҼ$RL"UCji^j%qb>m-ͻqsneRżgԔeҺbٟ>|6." 
%󸮢]?Rmd%rh&&_=ap}d5?gW'xAsiKS&|mɓH9dl421> UQ|KdYr#!0.̤&~bxlDƜ״ {[QXMWJbq;JP㹯C(8BqE4שC9pY{ HC dϺId2%ǒd:KC1-P] B yF9[=6дCwax<̨/9#ۊjv!'_֧6GZNo@cSJVjaLe,9C k3JKiOS 3 j/,ֽT՛B$u,\Cԟx% (-%Ҋ2<9BBT_KXIO [bf(aᯂ&heQ&QB `i<EZEE9GM=[n:xPvP-b-Ԃ"X)`eIHp+Sޠ%#y!3ư(H\.a1:u )]6wJkRsnGDz\'ߡԑ{U$8ǩ4( y^"*{2KSݨQ.c8cƥ"#!yT8bp*F#tZ@qnGAN/z^Q̨AЈi 7-ht I1ÎޒH$7ah[V(ṷxq۸qML9.+#a5uK+ ,'FcYGZb3Ht2n+ӈ%\FĞ {=%NҊd&Apwk X޻71`7 #[x,.6X&r.]b cШ17sr` hڹ9֕_AN,T &tfޱ6{`_HjGޡݩFTzGؑ)D}6vw WiaBnZ yjFx@ )vH &6KWYjǓ22޸_:Sӷgudnvej~̓߼kGc;jӄޥ(G9$jeZD:!Џnp\=lU"n Q Qֱ4+W>:%%>nR8֗9]^QZͺϏ̴nChWtџ֍ROcn}y:}źFLsڛu_ϒi݆А\E[:3u*FwM59Oػt2^u.]')sg+I 𴓔RaR*2ǡQ}ަ1iik,U; -FNZzZ ;5¹:'LR`ysո9 0E%\[Pbvs{fX4㳛?fpޞFpcw*E#_L$BF,Yȥ;POaHWJ$He{ɀSX>l5`:dva\ѐS@A_ξKwԛvP~Wm+njMIyfmBID8kP0KhY …_ Q&Ā\9&Q(B>mRr=ykI&B3Qd݌T4-*DQX>k=5r޲f!Jٿ~tUk0Sr<&_7w{''$\2<$$(&#ce?YO jh` %CCNOo٠ЍP`On(q!8x1B#{Lb04GsU1JJ2/ ZE!2AB+]yK V Ί2Q+K>>PLI8c 钳86oU\|bL>f[,m ff[l|V}ZوXo2`ibB+K %lVn:>~ :$#m.2UK̅$@샭VSHz۫ Y_Z |Kg"\m;&O.L-9*R1=zJ+Tz4(}_n/}/1X37xCF4c2G_c Z" MY" 9|ٿ/Qrung$lɷmuŶVXf J&+w1n&cn՚8`o:7vNy:뉥#sd&s7KW3c-`$Dx*?\r`kkzϮ5aO?=ZH>a9HYI:L |8t' | H32̤afmp3G{$2#CGyj &-=i-9B52(TfeVTSs3sN3‘%&;ˋ4Y#Q`Q[W\Z-Q,S, S-A(9%eAl$F!ה֤BB:4J" hTgQTx8- G5MSll@*_-zCl>sfd4YcJ~9oXr?JEFlt6;K8+m_X >,myk4b^sP<ɪ"%Gdred: O5٩]04dx?F}.uf}Dt̜ι/O7U+$P ?)K}xM%+Dˠ'+2ȒP A) \Ymp5]YV2rDKQ>H5_~xurNO.ūOÖBԖ+)ZN3cd_F7bc0\od;4!&LGҰ} OP@]H ,p+vCP"X*}GvBB:طv˯ݺ.Y2\nw>v Ftrߑ݆"bizґڭ y"ZTPиBAc1T~дSvlڙڴ3d0Dvig2/LB7b1$Ҍ d?V.8U4^WC8eiG!U%(f9T R"eITP*4TBɒW|v!.mVSjIWpJ<:gw4h± GO|}|`o[`_>)=_F wsלȩw gU֣a~$>%C F ^6j͊Zz\V|9zOuT`9PlYbJGh%KZ%;t ,iD5ST^ 2OT^LQ^̒FITWC~mrK+L}@ $hJk \efB,W@uArs}:l$k heli G})4 ɝѺ59*ղaA8-)8*-,+iCQM1y ¡tu,WFP*:ɠve1!8IaL+oa{I-P)?ƿ*ǡղ ^2i퉠UH#1;)wT9"*-/Cv5!X?|bΖw ZHÌb:AQ&B  Z\@5Dnz.Lѕ]#)Kp 5eF#A)h R8uZ1F^'I|$X(veC%Cz[CpmRP+Kj.6YYQM(Ur%QTQ U5 e$r"j+B+QrЖ.`PXϬJAbZ8"k, 猹B@!DbAC/tY4|kR\QI݂[>Nolz{w7e|xR;RvR`ߤ"=o@iΕNz"‰S{eBW8 DVLs *Km( man[h'Mp\+ lz]kr]~ 98owC5EH$9i-zS(Cc-vc+ ή O%Gdw-<Q(0^!G'qH`"!ϩN*VͿ>:TU>7K\O?)$L4CX7}NV7~=Hُ'tq3>?Fv!h㛪C[\DfrXW9H),#d imxouKU*U*ܻ.`C5: [#V؂I:]]3擝^ڲ|Ѡ)U˜lM'@3hXB?f{Q5 
.|Wh4x;k<tג-6ߙ7!Kn1EgkKiBWŸ(KKK8S”]\r:2|݆1 saa.L eQb Qs4_5Zk@3Ou~/O|fȍ-%:zإ5>=Lram0 MⷴaxUS82z Y>;MթJZdYY\s!B8* RC &Z N_]oyWɀ&w#;ި= Dž!'(Z>y,X"(Ӆ )Emm)F~b kKQ BPʀTr&vR\rtT,$N;zؔ O?@-g4 7FT Zh&Rh v`)yi"Bqˍ7Ė*FX^|gbJ/G/ӕjJKFDB qPjYՙ)10!`LNWaT)ɠgNm{bq ò"5nPff"KVj5qJ]!dQz_bk&U]ÞyT0Áշ|y=tGz%$}&3!■&ecϺL4!*ܲXX<cjq]@dvuÏ ._!7n_ru"E\-গcB0–T˂S_]]B*UЕ@v05g׳<). lHD$E|Yf|E>TTS.eNgX ieqz:EQp tIcKJ$8b.|.Q ̅,$;l֝ٵKOnC>E*Jc35Rj&O]5J YfӢ NgA4".WW~yY.iQ?+9%|nƨRF8q~ErC9M{I̲Ec{'A"G06;xaFvG$:o`ЫzIJs?&DӄD>< =f} 8%n%+me3߻xˬ(Kb`L,-5l 9*Iʍc~r['3CϼSQWg'GAf?g'GěE晨o%]Y0n|aa%!T΢/2d} &DF褓"C+S]@O ^R[mi(TI/ o?I@u ۯ9`mV *G:WQrBVqDxNQ4J U-YrFF3J6޵6rB%IF-9 'Wuوܵ}SMI#j%p8 MF,]L޿B"0ln5o%h0Ju"̂Ju!.Ϊ{p<搃`b7obu5۷Հ:Pj [t(1zn{[n8jպ VG?͂8~vnɫ\yMBWNp[SpXVN.;U(ze:ն}Wfz2T'{)tEkR:5)R*Ȑ(&,R"5n.}4Of.[-;&vOnV!8l/>3`7 \ :dÆ3OK{P9沽ZwO=rыڶ6,7 =vп]>'<huly c5{‚^鬏;NZe#Dmg#(]_=ՓMp'!>O%+g/Yј& zgcA/=..j\rͳ?ۤ=O|PƵ&ή@x/1[AQ+½CS'fKS2wbm:eIuUxd0*۹1S,c[aW4e%*k"\_+5o&>XQ \Ǣ_C=]f[u``GuX6gosC$y&J5"gaRE{vhSk4#LL4g DFD`R 3":f&{Y,g\&eۘ|8g7Y&SKNa||l!dDt m@2ڬCˋk5"򒩡D3+UÞ>&Q^Ud_3:RcGPŇG֘ Cp 2zrU>sn˂xgЫ9'J;6e;jɝR\62fzmxt22`Ƭ&,xmteI=:l شZ9kvy;:rsp둵%MiIzWޅ.kmKF ahP7|5}:Bp;P7a Zt^I;us/]⻥7Ua#kL%h56ʤCX)ts~4&G]6F;MV>~|vR`=U?0!483A%}cFHz騭2l-_d,сTRPm٠xijPwF(F$3Nf=9fT'%w$ؘ'^ƂfJp80( s" ^]јVh V8b[A{&i7:L6F,iC8 RģuVel۳ssdZ}vpѪZzr9O8zV[zdzϡN5CL86m%#тdw zrO3ĀM8ǣ =83X{ю 膏a{ _c0qc_dcqNU 69f;>5#EJ;z,6;u?<5YzlA*㮆Nf}}5>$nŨtNrƂ1cAbcA9̅ [dQhۃ c%+8~Tַ %)ҷFp10RHs :)\ ͘ v8?5v{RdQg:pZo ]0Gb"io"s+lI7)eo58y9aˉaÅ.J)&]+ lsUm+ H9 r{dX>$m2Q'&XDkB6c}uq`ԏTKc[Swؽ)GEnC>2ܔ#FA8rIC?ͩ92b^Ts*nϮ/c0ArS"rA8LȲ6"XE!I!'207+___\_DXr(ܺ]웥[_ʽ'[gi-WkΖoSkџΖoggLr@[E]:iRvEɝ'w^yqrm'-A#Hg p!Ymrj3^]-}%}_Y)LtѕKl`ݿ{΋9/{Ӻ YD#/3= ,`8:ʤ;-I3ZBn7]Q'ڢx?[_~7>yqB4ilzgcY&xy"qWw8}ss&>y햎>:d pCFrnt9'eb˵UNpk4b0b5Ӵjk &8je&TX_s 1!q8b5* fj^m{`n9N%%!7/ yI 6Q 42$M EN^d 4>$`drib*n &BLp7]>-y }{ͅ1-ߝ;buoݽ_5b:Fh=\w ZGsh] MXϯ~(Tzlj,!/H8BfJ_ kOV?\SzzD Bw_!t质a#k/, ly[GS2ju R$4ݸyw=/e=^ʄ;F8 @IƑ@PN8Ņ>w~铞o,(K5qF&zxtO;sP[؄>j7-t&a.IU:edK'(#RV'w8S!5!AJ[,BN'rL=KTjnc.(*!`ř`Ml b}۽3 +j*g* +ӧ\rj\٨Rn~O!1Eg~Y>=2.=T2Fљ2ϺaV,& d. 
l*F{/}ȰVqOoWLku׉)ʼ[/f'̝H{U1]ZH52M%[%d$|ԒŜZ&P"ʛ-,.F>}a CH]'a_B&Q[\Շ5`ѾK.(5NVX-L.T<' %}r>h}7Tv@HltE^hV 79@٪?CNJIʎ}|Sv8vT!rIrm&e%lU3y3R幅>"^g?i}y[Z}&6}Am8ߐ{Gy)O=OPzwEj]7:)~*]Rݾ]p˔22fCa}H,A%]L~}wBтS֑A@;) :pp\WKBJ%QreXFdCC^&TCj]KcIξEg׋-Cz~NdXڒ>SJ|sX6o*,E Qg-_g #K6]ms7+,}\e$uHU>%lRM>9`fl%%)Ǿk )iHΈWR\eKCLntd6Hs uҸם|47#a9I.x=l)*c@ D ע`N$4H X-n#/xQc((Qu "[F* AwIT,D[{# Dc :E$!|qMɳ=kedl֮dY `Bf7mnM4ZȻ#߸ݸZNHFFdhI nau $JՆnioH=&IGɰ 7hvo]BP|#531!D=9 ⪑hFkG&!Ce$kiL;Tg/l3'gPI2Кh6 ɵ=Ft貑hX <#I%33f, 1+BV@I.@Y`y5(`[F *"y&( gw,u,#gӧY޸TaMYf APԙg3^1Ş9R ]7DkuT""OT1MIҔ!b&e1F3JY1[3nL?4V1t{5t!y^]hȄ֌ip|p_[kg#už5L4 Xl8sJl<Ϳ8_>P,pmuCV>:K.>V0Dm4T &K)#{\>;&$JxJ(p`-{25W $fZ%zKS4R)ӊ{ Mie{`"%S=a#ɖk#mMm_94܍^NKNTbXkDxۏK%@roZ d.{ L)e݅%Qabe7kWB_׉նHG{mFyQEdWWh~}=-IۻnwvV,;ɣgg㫇K=`=h =uAp;068E• s/cxYĨ' }O'LC m$ 10e;aZuqW|:UlO[d篚=J|lQ0JC xފ2m2훿΅z'rMY3d_fv*y=8rvw)^i]e|}i|O(gTp6A=!zg N \Fkqdh H9x{Fa9͸km'7P[>PW7 #Դɼ(z,#+w V=VH+ȹp_HmXjtr6\d]-n3L!QŁYsV`f̞blND~_VEr 5+j-EiI9TŶ#59[9L-)fA/vҊdDևj׼ZCzޤ9>з]و9 .oj#>9{_|< =oTǓki#(I 8uV*)>2F0ݒŤ|< x0hnty .HZNӋw6[pՑe>̂PԳXݩe,xͤ/7:^bp @ʤm Su;L%[8`~'IS+FۇJpݡLh/p m3DiY .b !EEn̦6-R{M荞 ^L)=%B=4m~(Un心I<8.-sniZ[\\]W_Gw0yC2m@tde GВ‹Ԛ|zyRS0E. H_{zRHيl!3HSw1B9P_/ H ՜*ė "OyR1Uڒ>߄5\pi2PX6%hTpi4i<6<Lu6-zJaywaCF"V8a;Ge!;b[$~Yj;)fEC@^l _l~Iѳ'5{I=PZg<P/&q$O5^ԧ%sN4 B9:9璱‚3yQkI>i7"7 eyze7;G"kCScB}`:+)?*آ#5;(6VE0)ixqLcaQm<m'ǫOv:v~}{^ s*0!̗\i5Fw׸f+nbE=ܴvz|x窍4eEv7{'bEШB[7ۙ<7w~DJ z*Ez팴GM;#-!0 ml/@wh7ABGpXzzqm#̽nXyV e6JfY N"х4>MUrFfRp6u,7Yk!+;CI4lw$8JgppT@I@PR48ʢyLgw<ˊuciP bJM8BZl:2ْj qA*e۝D\4Vi0q4‡iRKs,x10 V-r9 "J->پm{Mrfs? 
d656 >c.>Yg+fdwWUJ 7oϾkO0lC_Ƴy,Eru2\iI4~]:.oInNKݻ^\?dX<|UQzl pᔪ);yzߺUaӺ :=ʺ"j9uh6C^8Ep\8.O :}ƺ3PP:uX6C^8E;sJsP >3QԙL+yq.Yj7 ,5/ԛ] 0ݵq'g*'lvyF-v{lXlJz}(Y?l?H-o}#uF%tTxRxթ^)]:.j=K)] 2dۛ$| F+J);}:B4ne5R,ܓsTR>ՠp7y;Ϸ;r)KCrR7\&u/hՕy[ZPCuv P+<ꒌUK\oZej.Ͷ\7b憱O3L5#+ˈQ2NҏCzU?Iw !.HYF3/{!S"Ƒ +& I{rf!5$M!͝I]M/~IVݛ;[H|9[n<ɳ}gRzJF!5MJ"!rC?-wpY$|_d^3fK Z|7/{@ڦK$է_ꭧ֖{HY8ቐ,O!1Ҥ¥ŶwT{OH,]}Hs,5'5'ot k:Pjry^d.S Y0ޤE^p)!m6jOiHjJdVOOYA 0;)dG- IڣCiȅ%p0*h2מ& \JA om#SE&bUfRV֨E `*M>YBzuɴ&z6Zjâ?w] r&QNB gEK,s4ݐ0H\ r(m2JRD)R(Wx4Y*v:#:#mc%> xXOjP a|= dR?Q 7!Gs20-trZ&9p33:Mi=%8k1ʁBZ28l28mTxbJjYSj:5 \SіO^2B#i$ Lε'2~P`&[W0H1[W,5N"{k12 C9zG?E֭:x Y5}DX*񀁶́*^0h9/tϻ&MRk%%;`+1bZI3`jS - Eޘ) e-2HITgB]dS㔍\`a}fWjCG۳}QЩf])D^!/8eO ;}GvRGM򝻍|3G̪6KnW}0].3X6"=) $xG3Ƣ#B&DĠfr4ȓ֫n|iM@ KdۯK[aqb:El JVu8%ԑ@Ǝ4e 8/1i623QZ?uGBOrCV%)6RGHHb`\_% }nbB|uǶ@@ۑSa҈#.ċ9j+GuB+fxkr WbehDz \"j-3M_e< cʃ 2 S1eR9TZZ* :婵E>㊀*ϊ\0鄔޵Ƒ#_e^ʌ5 ӎX,d-[JjlߗYUndUÀV:3x /qN("MWk< - Q2F.ʣhj\S5 ә@ V]k .ڪX:+k[^hH鸔BLuOCxe\(>@r[xZ4zKG`?'&ȧќF]ѱq9!Pi$ie@ "Ө >69v8$-}~ڕiV_$C !3]SO>\zt]<VjkQOLr|͖Jj/Q.Q.Hy:#J/J.j8o[ |fZ@2|s췈ް /-(#&@]vS^ v=⚧SxC %6;zg/J*@/CY5/ ڷ g 6>qm:V!n9ա{~ q? KqLG<*{B#]?'a1P9IY:Ny?7lY:gWٯ\GS&(Jfy)Rj=M0dH GLi`"x 6H{G~uCf}"Oof5@W˧wME>=λO/mvm3|~0q6DL5PvFH|ƤV + δ(n~_+?}Yw%ۂx!s@mRn&w6gwa8pH! ['xQ6hkC e[skgm[ U5@^ڂHa!@xm,m7;ߒpTp8J[n`G3@8"MvՕ>kcm JAJʨ YiZ2Js{Z@S)|JK" ծc稭%aygT.ћv $Uo*I=3RL rff’ס5Ǯ"ZڀJlk#lZN&5xw^yb<'e/0px7þ GԲ: zI;n2pRÇv,ȡt x."m$ƕ(j+e|s$ ===bK}m =K|^ڳc IkbFZX- I\HDO p]p aEq3*z}o]mn_x ic sxc kTm9MU;©%~R} sBʍRQč/Q02N69 v^d'4w*{VuJ C> U>q75XTu1buT' C*+HvvPnR%!ZK;K˿6XpbʿX_2 ?3HX$}ty_Ǣ2\o\'wS;,[X#Xラ*`h<,axCp{9zJa{۞8]Hھ;La;S>̃+ӅO,V}86XR?~g [sj/7c1/sPSm5i{ 1ūU_7U_a+Ъb!U JtYBNmOv} ^VՕYjրk}nb׍Mo?9{Qsc8橤c.18b:㟾7= wɓe ŕo\4M롐yhHهuP j٢\,jͭa9"V7ȉzNylzBE:#׻ukеc ҟ}䐠K`hE5ڋ6s&/)2_½cֶwsΫ/w\(6¸хp<;&2B\>7gukm275 Fxz })Msݝ G.ZxZj<XrU5aք=p+&F:Px(]깑C"ε$s|G`2}_L&g6~7ɻ@p_ %i.mwZ{A.|-5ÙNT>cVnlf0xn(TtFXRmj^)ВUvXk$Yc55; R|^e^? 
L iN/ ,]`m?9rmʲjȶ^vM-n()8Q:6`%q~|?0 tȰ o%EյXJr+M\6F9p H31)ѫ4R{\r36hfr1Gךj .'jUUEUmpn|4@UQ}6PM;qo6,=O F8I կ>~^&cX-}k@AD +g3}做 ȗ8䄷ii S!:m?Oyc/[nh9A`| KpQ@x"-+QgO:nqR5h] ; ʤrOD,>8cVڱZeDRn{#K:#Q U736Pq\jK}lcmɧ`j`s$ gzDbHyeJ3@1bLƒ %l%2DÇ%Ϣ-1y+!fCa^bH]~x=u|Zp辗E@c@ςy]Ͱ\ &g}7B5)kQ=BOΚnEDsbD_!^5 ӻ3=MKzD4wun]uM[:._n=>s{}z>|?|?l>ku5oDލ\gnd !(YUZdk#Ho,W@9b6vzdr/9N9ۢ5mZZaj^ j%ZtVˊewa9?C!ْ=2q7X>ÕE>mf@/Y|/ҀC2⾵(6]>ʊNAex Q4lu7-n_˓|_oTFc,q:4ѧsoV=r\|c;c)*"}pxMeFvѴ['Ts#5ǻŷ>TsҠnRرqsGMϪL6j[do?lIq'+q(rYWKYL#Z$:Dj 8XQQjATU\ pd8)uvA1kȅ_ߗ': rN\YA t1[哹b`-㔞w$=s<F!]CRMINƓ=bu}KBMlrtlr7;aAxùp(ڌLlr1%hҊ+9S<Y{'w8ɠT0v6'a0E+3m8z磻 Rggu *)cy(GE;X줡hOmcGurWƉ8x%UŬdUUijkjrZ֑9_ӎպwϫB쓫mW!'xoo6cZ3W=Ǜ+mۯ_횬S~}c2V?Ul}M o4so{zͦ0[譩ވůחՆas_9Ԭ>\:.ҧds; 9DT BN7b۔,/Vn n9z;g)(d u>z(39_JBXnDAr3emxyÁ x&( /J% 4( y%sT0>#]g63R=1bnjս:'Ɗ΅][=<\猐zNtS9<$Aw;ߌ\ޢzYԀ~ym V5ޕFn$R藱:2>-XkC-[U7ubU%m[*Qd0Ȉ83ʂX8-* BhzAz~BȐ@/8R1\`L -O?́gL"O[2'gPMC,ɞMzY"b G3r\pxdcF|q!)5 燘I|!$i]Oj]8(pBOE[(Q9_ce j:w_-r͛{8ʯwq6KY^$o ލ'x7E[5 cMgH LiR) B *Ld> BLg3ŵk37A\Ԯɉ_֍^>fCI㇟)V>dx7Gemʸbd^1m鲎!F ,Yf"C-A"O ^38& U*#&&#Ƥni2ޟ`.4Wu$YCx^tH7/\q]o/XqSn.KkihL<Ү\i]Beтi3eF@HwTQ3\Y~dM'||j$&_Yҥ]ΒkXvd{W\oj]Q~wƟn灖#Q-KY<~}m9]uW?~57tVoYt/gi&Ua*Y8oK5>-<)a!P-KjvLbI)6\J .T_6Xh8K$=:P-9y ߙ.՜]Y?GioU|wd~i@BO72S0syp\dF"ɠk9L?`W@h/֢"nv%g*^L.S&ϫ7鶻w;_e..>?,כڏ@OW?^~ӻ49H$*MI < 1Dh&zL=Vc)^9)p(sC ŃtA25`рSIG^Hx$^ h X,4JTSݼ(=FONJMW%<|)eO̟ޚ{d Y]*d.W ]  ,u /6&ũ=.M.uUt_FyUӅJ!N`ɨ{=5VSu=>8ou ܷG}[6WB"&:!LZ"#bIB@g&$tU-K#PbI3(2~*Q&jDH6q)ndH(єHR!1,, #J!LR&$+dI DTqyw#iFH>$Sr(Rc/H+*H)vif`Z"C.'Ș=Y n!~Z<>?}ʿv/\$5KO|m~NYzg_޳`c"8 ocT_6l.xY~R*ĕRdmb0CB"H%$0NƵ5HB0M3kR03&mShTU4Y))N0;%TN VI{URUVBUYQ@ gRXpvd~-֠f~к.OKIXVEK]jĭ 'h Dz& T~L~R+NJ BJQ}]biR*\=)u9_9@JS}KuޮNR:i)OJtTҐRz2Fd!T MQ5S}]ڰB,SR:'VF!4IK)9_z2ՆYNYJ嗺r$8efiRRW3UQCDt6C\Is}>قsf+`uaw&|Y_|>" CI.:j=*%p-b)0: C!FN@zrExN+Q21Q "n(Pc8ؿ&ߛGVj-pde(c19Da@ SXQ<,5$!>j''ǩ5+`><}IFهϢ:7p'J!Z^15,]S@,Jk/VEylgNaɘK;JAؽelӛPwHM '%6=C_Ic@1DO1fFyo"q)⛢".#$a"ΌB-<$80 #0QJ(BaȝQlHx#7}Pb?[t`Y_2[ 
c}Jt5r̔Pq,4!O%1H1(LeY>Lgi햂f%ʦ2وJI耕ec9oćs`SŮ`ބ$؄i`xPe[pLdĴ8ZHhȹC&9,㔈,6&K6%104dIFLQ1!7*dԀsĘ(X*9k,O,uܬ(^'|+=sT)}/Ne5\ȕfJ?zo]bsxΦZ[AR{pՋ@-Õ$ 7#8Ĺ2 IUalVQT[p|+F”<4*Q8I'$`_l| ;U*&',8>_%.X|ZX.o?>oBpi*\5Owξtz*)<ѼJҾE ^A_~ɪm,(dt_AQgmvM:ѝGg 5 ZUQ-< Wz{" $VM F a2t*ze^D'Z豷kdbaYmt Jݧ dM^xBH0,?\xG T=u嘵1$$BzY y^q+Y#`v4?x!@8D]GA&`T7Hw|dʫ4HBr4<Ϭ˦Q0+r`SX܎(uD-e vmDt,]v=N{x Dzg>2,ȸK3k|Ё(%r\BnlvB9:"d PFJZ_Ghhe4c_Dr^@Yk]Z>q l'UUu7IGVځgņ4`zג;7A4fGjoSUE/5Fz1u`;6եd 2l)Z<,72CJL|c(0uSBYl[sنW.>Ӿo kΧ`0P~/{i\fR ͸U 2L!#!ۻn>ģ#[.}-#āRbN?bˁgL&ң {<~Y&!jYo>"#~ @jղr,ZC*s&@Px2V t:[F7뉸MDf>/@cm8q0BrpS@j|u 1;IIciH;9Vd-Vt$j31PFNGu.s|6*pE1R i)8qle;0bs1N#s:J%:sO|1lۜ:Y QXl[!k{3_KN;wξOCY51(nخ*rPiB2BW?<2&mF/XdMb\#o#lGptM6.ULT0lEq[lcwVT=&5ZKݓr&F'|0h3/L`[0BǽN\"dRBBnHO\(fAQT PȝpOdDP|:BÞvPu%QsΊ򨟦{ $pUxs "XI:)+:Zjw KRg#8!-fe&h&(D`pik,Q2(>WKRS{˜, PJ/('bQk-E4JZKGf!a)u΂4fĪ 'NQiJ9$No4(1m)B \FUW~_<B=W2p0 S˲>6 8AB-BxdM 5HitP@^孜܀tTGc0yw_ %,yBlS6M[ys(DB Ri c4 (1SX譲X%D&xPb>e3B"NAǸTHHP=$Z>![%5Jz `s҅,U9ьQbΏ㋳fZD+*hPr^LJ8^MF6OƇ_rQnUgo C]ŷK퇧&.~HDǗ㳗㲁w`z.`wg0'_=<߿;m4$`x}qOi~֎'Ϡq8hK|h/]|cLHbpʝ~!+D3X|QtdtDgdF+J(B.EDd#UXt"RĜ a6,)ZX 4@' l, 8l%Hy,)#ćN( Kʼn<}bF "(%n(9fAU<\čG A3yqn @ n#-F4ƃL(K !h ?f٣ ɌMh*pV!:AC2 \*PE#Tڕ!Uõ/dx5pfTF_eS\ڨ4a!֪Ut.P/&i h 5rspC d S, 2*K$B > 71FM{P]J:hAIZZ{z31J2}5A5MmyqU 2H0F܍ZkVY0&`'Q`T ;By IB-E৆XwnPC UQ65dB1#&SL`kʧ?̄<9=^Cy[g(^)9c9ir.M…R/T$Zy$1J 8Sb&ب1 CL pKjP}D5>zd\Ҏ;F$r4ByȣLw2xxB.΢N)~:+H:+ ?6!"C|`r{7F}mwdR&f=#x+gsUJ N oM=zGUe> ZZ׏\3>ʯ~3_k+Ԕmd!T-x!Pn+_zPzZ=˚'2BBӢD=*9*G WҚh$yu\*W\kBlڮ.P,DmFoA G[&mW105^vQhпJ{  `J&<\mʇIMa:CM__˦si>as=jq&Y %7%3ffajǦ7/M(u>6ᣝ~jTث)84sN'&atpl& #=6J.4(XQtAqrArAJE_:eЎkp.lO۴Tj.ɚ<|. 
wQ*Z!w8( 2iu,H&43B@'MMzE(a<\+*\rIj橛LmC_Ms{`Tٛu&+K0xȼv@#[*/0 _Odot+%yU՝5;P,W&6o(e`84pb!$puijO8ç{<\pm-?U&<$Q;A[ڊߏr Xf>de/)TRh/K rB #Ka2cHeh@ uĘȈ/7tUF+X.6g!7-9YQGL[6~16at||(754ؕ8CdU~*2M؀HM97a|'ղR_SXR{SSF)X/%I_"{6Ix_9ݾʯTbx`,ڬٳ)1r <Z&qg?'l[ȯU-_zRGf=9PˆWk% aSMfaW0!psO\tqQ\$Y ɕ{0Jwtf2}K~@oIT'{v}ƓQ&fUWi_Ug\)T(rC(,:v{?|*7Mb>iGo{d= qBNM>{>(w (<9NxbE00p8HLZ86M 8iń Ppyׇ׃@; >x9hmk_PHP[;(EXPPX^Z,d@X¼݌Y/4 jm^sL|\MFKmjq4XGp9:}{s5(s>me9N^ w=p(׭=G B;)v|gpAY )dx?27(mr#ۧџ?ɥS [ VSЌc1mm`%@|ab0ѹ&֨z0k<&*JDgFS#fgHKR^?cLyE-/nԈd9wH:AM:qΨXO r_?rs\k*rѾUBPD۬!{/VnAs׊72m5y9_+Xx爐fwNTp racBBWO$soI76i-@djs'-2Il]janOZe-7.pVϒv},6U tr\o:*Gr.r*$HLjG<"¸͕Oȗnj 6t eZ3*.:' 0JlXe3*BѪl@Oy|||JKCdd0<~ֶc+7H|ӷ>zAKr"_O*/&ޥ3X|.{8Kh}U5w֋x2Dzb>G[;|3 C|?6\#Sjh%_>ݻZp9Fm&G3zTd0 ΚC rMzu KL{ Efs͒AkPrC ll1Rl&gIYmQzeٛ]QJ87""-5ջ]^d.Lߞ.-ڳ؅scdP@4vv}uYvw ˨;*{5{+Zzul[fnP`vjJ m ;e_wot >>ꪅЭz5Χ\RmeΫ^-)@BGL}x=Z<EMի^S`n0 >Q%%xϣa@u|؞tLimM2SdMꖛJ(k}0ZKw@aHuuOJk~IX'?ɪyI ;=0\1%4ᆰ5BˠoxVR6"xƒv_l FuԬM.̷{X+Rd΋Gs,H0z@)\<'pվS7|@E~j7$5h `Tcp]#S_TdvPL}iD[u|uZE7pIԁ8L-@h%% ?.}ANA4}c`)!DL4K3.eޒweIzJʌ}{ά ,<9eH:Hl4ܖXȈ^A3۱/?V.Ng1!f?xMxM7^;p8sMKr* IR?`|פ<;?1:y^Ye(µiޭ5I&KHhsbP(6 C!HC%/՗|#j]|d#_MYt S2:`R3_j!)R9y!her?sƮ *B0=6z0ZĘb̆8M+`QK4ajo^d\=\_]yDžDju(x2[d+M.sѼu(Nf>3C;hA /Y2-_)`5ɧ'3%SdS6\/7t iEXJ?EirREnܖ:EX {ߧޟ7G.FWM𥫸e2GkYj 2:&ވp;G3;y\{->̵{>Sq{OLb1zbLZ6w2k6ǡ~漬~}'o~^M @o7VE;.km-r1HNbC3UՔZ×f =miB˳Q(/rɣ_sJƧ=0o6,z\Zccv/ڎ:<jI4KFZè; .01~M!0*I!"`)'טu-ug8c5?YRL{YQ+z<3׻ (ӎ1ޡF &ejw Jtrge~gﲥvdoŕDP]7:']h_=:g~3!5s!6=snN+RƼ:DP"F\v*/$_O@CԼ*b&jWe~V7Yo^Yx-IdX+@[%UrX`7>qc'K-6@Wen;Pm?X̖{%&}-MdDQqv I0!J ID6}@wKzfczn6V(y(nTvlD/wM#`Ȥe2.Rgv2:*UٌvGstv;sT T/N2Ѭ@ICH$;&$nzDw_gmXP;g SoC45=b/f"K]ޞ~7} 7 i=63}6Y9~χr%5C sWMmͩKCI&)$nzMji璋 wv28'%4. 
Ṙ]W?T6YE,j̱:ߊQv@FԄj.8pjs)дvUtʚ>d%=VQXRIzԇܞӧ/ݗk$YzbFwڟ þUJL 5aQh H`vU2.99n6ne1ZX'yߪawPy[9\oc|W u,Ezy a}6o&}K.lXaւʎa>;1{aׂ[jy@_2j.┦ds8ր<%c㰼AMG?4ӯc]`|z!!&t;vf/?[%229$ )>?j(|]KK4H;@iJdr4q㼐\Q]Ӭq21)cj6}$6!2 =th0An ( 9%H0B*q%;3T}K\{RJ9Ub2+II%VmV.j+\+",͎_Kc.dW68N;֥{?[PM=!0zXل5 WlJy 9:vS):vց2A w5Ј'LPޤgGޤؗ Mu'J$MH^'.'B:c`PSƵ" o,HQMB%x)9'mW|gE1?"PՏp:_@LgK=v|ew@#-/}3w\]ntB~P63bًJГ%?<Ķ/0A7GE m/5jM7nBVI 0cxB@cp*Z !:-Ene ɂבh](JzO @B? -QJkAnu ۔\l4AoS:(xXK&2Iź@#Ơո/b}/DoQE<k17߇VpbnXD+^Jy.nwK{t5_:g7KԼ}ċkR޷ЎR'^*܎SX-o^|gv3EVfHQ^L3=h:cS [oKՎ1^oYJ5*} 3B ~qsy"sN/R 9w/bڋ?8ۘP}tv+Iܿ~/b'l'6-`GN'DZإqkϣ4o`??G'EմI2&TdIqt;t}ċiʱy3Er9~$#FyȱksN KqmM'sȧ~Zn2.2WH3h S:MNiYɠ lBVr+ *xЊg0\C mI F=e)y9ͅ[bGu L3Bi s QxXp :ùW&[>O۠}A3Ju@i|wڎjwDo;[O||& 1mps,K~㖗 btNZPf!-RՅڽ6@q,rPĎ$ [c *JӼM >^SDZna#Td:E {o'1C<\@!.^C,K,!=\qAF'qQdmdgg6"bn(zJ\\w2qˣp ͋)uZ,ܯGU@֏m93ѤeS m8{^Qa]^1󖱃wzVDSnK6 eٵ3'+'\е:=)~ *5Y:E,yO|5}GnPts%/-{@So÷Жi[1'] NeMe ,@P]a:Z4j;Ӯ\4~દ5C }]I3uOtzolLgt4+VU8Yl(X:\ +re-Ҵ?JP՘q&fx}WAh.J"1vlfsA4,#֌3e9ұLv4̫5Xb@;?8C?VY(Ϙly};:K5<Ɏ= S-g_Ȼ[{phoq|ՊpS6>?F%Z6bJ?"/+#8;8ksQhqyC+g8ہ:U]=(Z ;+!NEkA6ԟ+S*y`@q G 2\Xj}q;HShjx$pQl܇ùcQdćE%ڮ+OdDFvv% `oOxQ !ץ mPmuvY|2jAvg r$3My# ;}u=u7b!ɴZǾ^21@dz+޵ëwW u#'XP,RN:9I">mL=O\裆=Ok%]7I/MǛқ#FXZ^ ] v͚XeBuގ?TXN]LWz5Whͬ5$:g*RPJ$#hY 19 88Ap [J85P["d:*Ո%Um'=甂ޖY{t?@LeXaFdri49 %, \\ EPâB -VX-X9#+聓دFſWg.Ί&ϳ(m aҊ[/?'-6 :KUgԤvLCbozYN>"xYbjW MKטt RuGÚEw{3+rLJ2X) r[Jmr$X'çi5) ;34󝙜i0oLN҄ \%I| G O7?Ce9ōޅ;|z=/k%pǧ|+Š)*_?&_>?+=0+^t`3#2._=3rw|Ɉo uu !tYWΝͿ^y\b<FcHP޾=3l`Fo_㶗*-4(4 <{VcY.P+#=%냥 8QV_z^qR%\.="tZ-&R>nEM)A`bUWzpR%'dC(OQ`L>C?J{a-Άnֿ²A۫IlesضX孋j%ۦtPhsځAnL1-yL?>=w}{y{Ō{x_'ُ xq.?W]Y.Ũǐwaïfz p BSĤjzy0*[_ص_Q)冀QjT2u~=Dc4{0+yzx!!jmrh3|W6mem6 w#0m(*=F1 0! 
E`w(pBsF.qԏl%i6W:uz+x=jA8ᙟ`F@H^%(^z\ ;U~vr>e=[tf w}: zh[ s,ڹ9k!)=һr#b-sRGiy6 @SQ4_1iNFS7ͅN:hG.Q^E9s;ΦO颎3뢴Tfg)>>eSǛg0<\P N9V֥{yhgş;c'8ܲ:`&kYVEWl"``g;/rq4Ȕ*yQ]nNu0OKe;nv!!'.!2%[<ֲ4Yckf v+M`Pws SǼ8JIN7~qN*l>:7ý}-и+p VYJ=G>Q71ƙ] .8`آޔR;pّj'U~1؃A;q:mZrKj/2j]E}矊Ep1q<7r.*'KUU%)BhsAih{[o{h7}"/EQKC1#\RLQWejWEh{ň} Mtм]ߊGb8t>/o'\4u=Wy} HO/~H'F}c2b=!J"]cU Dj\>#>SN4$>~|&agF-q(667׌Lv[[-Nb`1zf'1U#mdu" 5v'fwOeb.CR/uZ5i0U߯8Ͳ0íH5k5֚Pcy@ǔd4~ǔ:RR{l^Qzuأ˅fwӭ3p;GCuN,jĄ]P-ʆ:`!Qk,<@LA8U}w1@Pe!FQmmtB0VѩKy."F1$zV#-ؖ/sFVʘ\٨꬈%>?-;,`W*-NL;ڋN|_x>{K D2wR r8͡8NםtQ:VPLÉ RIÛ .th6dD+Rΰ+ə"z,U6Qtc<1B ^_qO,t7I/V=9⹅x j$?*`#FlpoȜ﬒K]xvVx9qqt[f|pldWg_n\,pN.cL<'wk>o0yG-UVd;^R:ᄼOYw{?W 58Y'ұ>XD s8@Xɴ GmqR:ťEwnR |n'fPJz1'8PxlIfaޡHю9Jq}=iNXSB<5} d W3Oo5 ~vK4*D;gwԾ:Nlnsռ1"7No8!) 9KP,U8Ϲ"(qpMR+Jr +(q9vydEUQ@e`Zsef֮gMc{nz]>m}TZB%,R"%<Ry?P̴2%VթNmE ,7|Uki45Ve87(Lj+ $gZIx6>ELR>[]1S 3y<@"eX1R$3F1=(K˜n\˝ϓ|]).n//.Y"$Ij -YD!HiXfF4Ű@YNl1oy^%AǩXcmZ >Qm6p!*8VHP5"I^}O ɹBR0lj6 UcV2ֈƸO{%+X/~w^`Ӑs\3,PeAKGaػdwÆ1D.ĿO%%@tՓ@SqaEB7<=X/Ƽw{s ž?\{zQl~Ab>-w6q>jA[nCWƉݫ%susƉ]mPU7LZ{84@Ump @SԦ7,/wQ)i5jZx =:zY\O8'Z&}Tq]@P :;6hWpJg@p補[|a:&ڽB7qsfh tJZ+ X0A#.1.Ht+lTPjSCRJ!OZ9"rZ{%bV{0+2{d-EF%Hm@M[#vާ[c2vii`797kUh5󳻯\|wTecrTo ۍ1Xy->#5aJPu}`y >'?רsR4|Hiv5AIRTJ&PI-'0V7U? cV]k8 鯍Mm)c jrkFԋduꇊ}Txܠ1Ѱwn˨$붯[fE8 +g&6NJ1 W\__o ^mu'v x+ifg["έ$!!TI\K_Ӱ#`=і$4JQ+{:iW+h/3~Uꋯ7{M#\VI2Ч(9 91v&dr0QC14kP7t@=EI T2BCʤ_BZ8vDPx2իƿqצ9?Nܲ>Ņ" =W (`6C ~~*|r!%tnicE?I.=^`痮iD>䘱+5uݔ9L_QmVQm֣zl(. 
%g[AJNyD{Y0rSVV*]B|%~M6l8Q[28q[8q[qnԒ1L EE=L,HB  h43ʳ"UQ~(ɨS?e0A2jD9ƙ'O9#f(sg4X-ɹ y5AÚ-zr+'Cw4 d|GW`YN2&Y嗦;sDZ:\ E+E.y(󶔹+ X& d L3nZO^0$]fќ]$' [-^c8m͡iLZhK޷pEGh%粒 7z0GXƚ[5I~W |q݊/ |tN K#tKr~}%.A򕨛'+EA7Ok !99p%ePdtst˻wbtDJjS5zz݃<^ P9Ӱ:0U15vJ<脩7pP:yePGLVRktnUMm0\k}yGU-]`/u)cZk޻~u/0RƓ[8Yk43:9FI ӃO=&:/Yt^|-,?;O)#6 1j͐ f1λyj^ddiLST}V*QI[6Y);k]R35_<,/MH~^f(Wut!K.dJL0'6!FCF7KAOpkEO[OZ; ڪL !@5,3N`V-e Kf&hVZ}c0xڂ`̧v-!:Y><+ ٿ!:@.2ϲ Xsj ?|c7(+Ǐv2zN=KEyN }mI(]lWD?_Uf_>];.S\/|vJjH?x4 Y2OQ}qLK8=zjJwĊQTVV Ȓ٥9װzטװ¨cEAstØƾ.H;%%B+ZEu!da';Kfc 4{K"JKzcd#`#/#z\)!/#& :'ݾXx g2k:]^fKa.QweS撩om-L@Li蛣p?gde)*Rh8I4FDnnšK*.F:`"IL R q=ʳo+Ͼ{v˼-# sX?zkEEaRI!}Yr >DׇY=wQ56EO+Ս[G2 e]&= t}-!'}:o6Fa\LgzCP+$In]JfSJ.:<ٟM$А]Vh$o 6XWb0~{,T{{AN\&Ԉ%Z1*RY3V]6*&vIGRz-!%VtMSe͍gBeHA&`ywemU#'/H ,4qiCq4ՑJ9|FZM=J!hYHx?>i?i=8_% e|2L_ӈ]Orj_\)B$HHFf QQFQ˷J65Vj!Gg"JŸֺg:ˀ (1"P$ X%/$":ig)[nuux{7 |?|hGgC-`MVC $֛>QoqPmXn,nvէ'}DC\/]u 2f,duxVor)sǚWom&ݷ:40S(,/B0o9/ :o@Y'NHUKYJL]/iAH07J rnbz(n f٠eR QP:lIscDK)kljpy1E^ҳYwjEPloEksۛ'ځ|aG y6qI7:wR$AE1Igǧ{0)CU1r`}sy_\bu/#K1:QC]9^鵭Ip#5ki%11%mƞ{uJʳV7S|dGQDmq㺖g`Kg8'. 
)+6}v0D%87LnX'-~Gu @H5thCscy0J)~_]o#7W-]qs"[ /Yl69Vr2 %naݭL0cf~z*E@F"Њm 8aIVi@pfVE!7O5]FSXԌj-UJ 9JUIbE}%*'^9k]NҚj99XR2]"ԚK5%s/ +usVV`c0 vb﵏noܲ [W|E9/eؠ̓?'ۥyil1*ƗWn-F*;#-LѲ6SȠC{09v)==?AL9HB޹fɔnT0vAdHkALkw{nMnH;2褦֝¼?"Wv^,7o[q[62?mPkZ벓sn(,\7 Pqx(fW<7 XmvC!jb%T~E=5bl_gTrFߖ~WX FEV!k;k9^)ߙ]( z8⺛gA$pj}0,_,j/,o֦cX6jfuX{TjmQ9*V騝!!]9"IH5>ݩ2+q"RQm83%!h7 66<&ESj h)qE8eG'^y^RjVUg๩*ό NV^*6ayD+Y9[xq39_|Ex$vkuc?boO.fj(CII'פ]K2%C$ Ϸq)چJ HʕIpHLb"8; 8׷germI@:D%DK]RCW\J2H0BH_:RKPgeeBqj<|J9Z߽@W(C|5FW3.U%.=FĚzڿ{ ݏA%l.s&(DEta DUY<ɺZ2ة ß=Os1[kV'-6 ڲaр6߯VC<7{>8;؀W!бݼg=$λ\Fi0}mCL 2 ]$AL17;i$JM\*5sJfXY)a d`)nl,P ݕNRZe2;JBB8GUzZfhȎ"›CHFxjF@*Lja -%Ua$\Q^ v+%ڢxrn9 N/pXK?'=^NB MŒ+}Od0L8 i~X(;PrY4*gѨՍvvӏ'łZ%m(P[+WxEK΢C0Ul>/ 8%N>;RP[Vt:Y&D5&O[Ws~SjVe&mj2* '<*2μBsD hvjE 1,0/vθќ+/4b2S{A?I{ȓ?2xz][XfӨ>~@~=[$-o/oŲ늨-3XtTInX,é$DUjB _g)a)XВ>ٺϞ1E}dɥuHjU񯋯?'njr@}E) zz'?~8oNON()?TS R1AJo19}@LU-=/>Pb.$䝋hLi2zK4gz -);Fv8+bZX}kDօs͑)=MƠry":c4n"Y[߅Kn]H;2c(u䴭j$akdN*FTTV i`iiDTIwU*KǮN)I#)Gb.qk[ˑ}؀gr$1o0 H1\,U4nrc`Lٞm~4lϼDcda643JwJzxFh)!RWf+M\ .JϿ\<=(o'KJS}ڐJCqoMiDQv:xrj[7)sF5A&,`pt[N)_)mYהL:=j9 کi0 aevjO5Hqm&jZۈٍSmQu!;IsJFB>S np^˃1W, TΫ.$䝋hLinƑɃT1eBL5.0hj6NE4KF MZSj\NMBT:vorݺw.e (ђtkRav4TMDǕ7ņ¥ʔ:2U|:oO6F!'\勪`䱪m󪶍LɱmUmEnhb -mQo_؟\ȫ_4* .Hi]Kt7g6Z٣EklEog 0]C_\_~EQ\ vn(u)W1{;fJ?zSM4PX\$R+6*` DBnuXrE}.|N?{$d<".cDu-qx6JPt56&kl[<_|ҽn{o,. Q8 U"xB9-6_pmjfKaqÂjJ 6;xiNٷn _^e>"[.?e:@1E‹@P]aJ hY 8MVc<hA2n:v$rh%s+Hyr^_~I_(ZTF+ . 
aHYWB(Fd Fbsj2bB^ǖ^z;)"_pIip4Ď;jqf0R$%8cǍyĸ_h3U/e4y4k( Bat(;k8tW:'!Rk0䨷Q֠?}%1{?6EA3 G_xJLVlJZn?8c *I|hBDڻ)5ܩ =d:tL$.ߤwY ΢op}ozׅVa"\9kLZIETUTlF4^Bw Bp2!SSlҰh$wɪ bE\yZ}V *{2-9lLir҂$j ./6˖ֶ{UOm9&e(k24:ϜGrY\(]`h+b*e茬wm9ŕ1sn>]J»ܵ;eAZLJü%Jk kdrAf<4p_mK2%qATPeY(,V"41@.N B TSnO_T!2#Zgq,b].F}- dl$&'PJhjHb"Rb4l|Y%܁b୥Z) \lbۀ"ƣ/Mݢtz޻URY%1P4ꠀ<}'X)rRhJ!s]$jW1ogu;?3Z`$.k]}GP@ejA$6 #-BKu0Z2apYgjǍbKѧ/ Cq-]E@ITNvvّmR^=u74 9&Xy&Ro[Qd00ojd23DU9yG mRIs\V݉MeBji#e”1/nMF`JVjdfzT6gMS.ldQ`-1ؐ VKsvkLBfOjs"6hPPEw|[ A\[PT0vaaE"FkyJt0Q+2y!xHcH$:f$_F[ ZgݲD1քTP!@ v&9,q#D a΋Pƀ* I if )Ʒ|SDU{ǔ1I_2`ju`پL3w{u7-W+WvOj rE @x5_G,V|gW/gϰ e?u:Bd8Z/l@!eT^% c̃.Vnh2u^fi<'Vk*є0J6 bq[3M:\ZaiҪJ7]i kC+AD٣Ð$IҴfWkG-ᵐLvU'<c,Ђ!=Л!cz_0`< 6^xQ(n·_o4eE{h0DŝAC3'NM፝@yŜ4~;DUɜe3V~I;XP񡥷~%60D;eZVɢDs"nXLyF|d_q΢iIJUp1R;&$쵌Qpz(@b; k-݁[S); k힃)`ASHžoVc_;'Z>b_\ x3zs [-F :8ԏx3zsZ%hW|{)NO>!i'$S9B"DtiҥUpd[H)2S|9FGM^M5ނlgF#۽Ҙ&c~xz"h<\!]1=8#.KlQC@v;}"uHDoUIqKRT}rvwQd> szfRjuKٍM R+d;/n2+\sP>ق ,IP]%Ŋ@W`չ*GWg@W``Ljv=FDRzD4;D(j3 lHTaG+@~3D4 J-"PEJQ4vԬ}+D9`除a\f4۰$LJc7 Z1q05P`_|h˷ktTs!9twmbTnoF'2;Mmy1K[w6β2MlDXB6킻ʔ-̔`Z!-"Է:xQ٤bгk?-V٧}g=S%{hnX˄U*X99ۗV:D!.m `:de;{ς!i7^in$dbN׾*vvM nko=\'!nwiD %bN'(8qbfN~-8.#⧻'; =hI(ܧPRӚF`ꈣВ[7AG敜apZ)xK#>:UKeF4 l!"PIr Dg QM.ǔb^ 3c) sVKN4RSxdYδT*8D APeD"SzCM{䥦g"c2XNS'Ӆew7&i4yTO`w&r5)֓]ޏ>6Zʳ̖ 3~q~#vXܽ9F?^I]YsQ>1ףEv7|bيv wYI4 m KUj ħܸ}pݟ=P_ Gs16+ `!0myI*a iƊ7﯁53g]PӀ %X]l~kMg !+9{wu o QW\3UxXVbc[j؃ʱB[\PO.Ax?r28 #F`,85"e9#2MP"ra1(J@s\;BiΚ^y!y"as#Ѻ.5z.Ead"H;qM`jwwKQ!"xj]_ųRLy݅rqwNLo \<.nz|]?=WшZ~ei>1Lv׶?#g;FXЕ;L?&DK`:拵et\*g,)°G7Drs<]k}b&(RnOqbXh7]#t rzS>d|J 8V cOVZ21Occ@+0A/6ثscA9s* A"TbDLs1MaaR9Ok +QW=T/!@P.Xtx\\=9bD(!͹^N.+ J UCJb"tvz>m#j N>JkX"e6u`?ޱq ".a: P0* 4\,S([ |ABY]m(/)T`|u aW5^ A>}#FMh;g?EBeZW"LX-2[ $PU >0Ś'"a&]0H)2+Q,RC`brW=3XND8r5̎o9I`nھyi7dMeGOgמNԜ|-Y-X>Y= (FD{ k$zއd rJ>a>:s89eiRyF,G Spl17$W4ܵFd͈/"AI,7ŵ4Beu^(TXXnpǩ`ya6č:@]~`y V2TWJP5/m\ .ƴ( EÎo!cxΈpy uouAͦ&^ԠIV\'P0AaS% s\Lx^Kjj?ܱ U`&X*5cƑh=a`Ƀy>EzhW8лto/y[Of~g#/ #=J?T4v`@6$/bX#TLKtx募FP% S9ۨSJ ɧ35&TL&_5*qGfq{],}F~#4)Is0R(l'& KĄϋw3T#)J:4CzILޕ ŝ^ hQ'/Qò 
ayi20ֱmcmڷk V{sB6W@F`ӛ%xְ!dVKIDRSv[E %D<-;y%v}y6xl'  4E~[H$-kX 0e͋2i:ff99pgy&#'J B_ ):6 ?dP^YseZ!kqB : fgHe6D>`DC$/_ 1ýΎoaa#ΒyylD3mqW'^q < q:Àz0'à՝%l_z[/$|ˎo7Mku9aWe9S-+:!O )7o;yր4fB֪wrA*PDv'×JSm$Bmc2O@CmQpJŝvE<@r!ڊ/,.X8e7ni'm9fq>Yxo fEO}/9l6Y\*z¿PZǧZ+vev/S"i}ĕu]ah[ʬjut׈J8yoM$` km,{:2Yu(&XVh!fr1ʰ%Me2'&(p SrQ9Or;OS!'=6Uд4] IICGʎ$X"]sɤiB14'y˳W{G_ Xg5Ob 42N(ۛ^"4 }RS3zBP9T3CS1FXֶgP #Y:pc.Bgg:*NG_";]ow=-g;Q'x~=v[]VՅouQo~8 I iXF2ɩ@(GX[YS\3$ Q#Qi"_n^@ ,lCGfq?1G^܃Gv5P--l_&$oJ0E=ՅP达Pw͍H6+ј*qRLeܗl@o+ٳK_,ʢlP )*v2q, lmTAVkb~8pQheR &,Sgf q8c:祓S I"Uˀ=M c Յ\M\TN@AKfs29~DjF'aaxXD{(r抌0*62";(_9 E֊e`LB U=t"uZo̩XY^\i SE([+2'`GHL(6`T*gR ykWuLқqnMt"UbӨ|z @2\4CL*ApĽą[%p ˑ_ﮝ?%)HۓD!-?UI0IxG+(7tk\2A!S+ ͔,rBU='J&Be&m 5 H"HD&$T)^Fpnyk%GW.ϋ@Ƹs^PDSSFRu&A#NZHE[F2e=j\ L;ȢE$4e;Mb`K 8Qdy<9ioW AX Hwq\Y>=rw^SçK<>a5)+$lVǕf^3 ajkLOhg%'>]\ IcN?#Bޖ߹oAw`xwuhCN(- #B  xx=5dRy#N{cV4ǻ!ʽ)r\k2T;9Sh (AσY͝eYmGe>e @\,C6eP #P%Ahp 5M8!y"hIWagmbm2`s/rpQO<g^(U.#wnቒ(Mر*I_CS@LhBVX7"RrT UhQy(J|`JqE!.КWe8ٶW+E<"pN&:^@e!ҹ̽& uRܡ 2%=2 +"LjT eJ9j2eP][O-e.SdRyt@ e (c/gheJ/xƒ)i]p4vyڊyv{gdYQ}9yXZ]dT$I7 Ubx+q<K b+йuz_" I L'8!f<*>Sp6sq8jn< 7͕\W{{Ќ =@Y҈gD`]2I'Jg2-((mqUS `ў# K xQ"4A50cN`Cs r; Ie檎 E+P- mƕ}j&hJ1BD n h%7dB!i,h+xkLURW3t)g>v.g*$׬FejML$R2U (qA P::)XV %\$t %fEQֹe\Lʸ I6^!zP7@ h'7 hXtYG:)i܈LNdU^h>Q(N.St>_*'\TUJ>Esm%i?j7ӐZ?H!{Y)"CQ2[AU6̦Id"3duF6MPS? jySij|u9q_KVr7H5 >nlSD*Pk@4o{򓷳ہ @T>n|`URZ"k/{WWET-Fn~wdGet=N>e?ŤDa5sKgzT=s9_n=60O0mnY(eA}#vKEû W+qB4o_0aDINQAElv0cv~9 s\N|"Z5^/|Ɯ? 
i&.er[}H&,] ζ3ro+n 󣿼?ڋ[T,:0%?q4%7M$˟_ʼnz(>eǟ?y|50kk3Wťך]%CeH т+6F h0:H#="5X2LL|DxZL kc3Nm1' U2ż$(^#saC.ŏsGPhrɌE=t4sgTQ PrۯFNL`ʡr~IZGsYɴ9T!e~UOqPz&n΀prB4QL$(&Yr;([{|+?wkap %{K6ԿLQ@xǐ!W /1.Rw;UDZT[wDƦQX&xD=GU3)eK]_D..d؄@`C8gyGD#Z Bg o (dr^#V*1}!uNQ`^y#bf2"<j" $@y$@y5Xm"f0rU7*I.BQX/33yf2bUa/D,EIMR_HE{4NikR&'*0ή⭷͛2sNW6M!5@|kc( pϭQ'wPpwEgNQD[?(g?tmFMf!uF{ڂ,S3)>S|Uټ6= tqqtr͛J狩|,grI=*uV lrրաO艆HӏCwǵpWFb r~=OWj򧥀FBMO%FR QiBNm)Qkeaݺ5{̧!^ m6K-L+x/^i2]gؗkCLLvc{|{{rMH[_syzx 1*= 9Nv\X?e )6Rkrk(a)5Pn{[>r 4j]4t dZԄ'ɴ`U7XعBWbHA,2-kj؆:s4YHA )=.RFZZ8`3t Rn 0)q8祓S#QyS4MIw(!ySL+!ͫSL^> YP'uS 9wQSS7#$$R(YH:n}F!%w98ז2pRAA6IѺ+ H7Q8|p6s7?YH%sQ#9?BE{ۀ(DK`?/=97j𑭳T+RID]WHXȲLLwiyk4Uơl .|Q?&SPS۲r?p":k2>HE"1+u)apwv0Ji2/mY4s[{,9W格6`N>XHOP鳿r[.I2dv@ Ab#:h葠U jw#[E4IѝJjLW;j\ Uڐ.&ڄVO`Ҵ8q\l"ex+Ijs>fMfN-S8~VHκ* Jt]4U%R$T4{qI԰``ly#ϛVfbzgUq)jxfQ9vS)]<F-dZgig3hQhy@bw7U~~;×-Y8 @˧GnaѩX9BTǕf]3[1nt6 l~xWV<tRS΅~LBs@` maW-5TՎMhzE/z|t8 $ ijETr Hs]EJ~'Du6'kaY:ϕ/~6-CʞΑew!bϙnmn4l%ۡLR؎ ڲi醘}{<{4T0-w4I@;+Tޒ oG&!!/\Ddli7ƵvK JtH3O%Wn-H $RR qvη8\<8Wj[ I^3*3\ L/IIajj8ZG=-#Mn# ܱZ~?Ӊ`V#8)gNJ,Zt$/5e<9w(@<\5[x_w 3{z]Z1sd/.^ڣˎwQ=M9?ooe2#g I8*jhQuhː E"N_ b3rr#d&=y/whd01c 5YmL0tr>NKjC9oV Cz50=W/cs*pk 5TJ-$=Z'Җf-huWK;8LBOyV<J[ >`/ۋt?^8$P <,Ӆ|`.4"S2 uzʢ'kO˫̏%~BG%'mbٻm$ej#j2G2nEX{~$L65(YdQ U4E^p_di4Cyrl4en FC\kEMUϛ]>TS¨=ZbL8ҽ3Fġ=-!E#BKT<%-*,7&ϙ,iVDJ)H,-(BFi^$^ \xqo᎕  vS&CBkf^[C TÈR)MQHar1PS0F1b)lb~Cxg / e3yC ;ԢA 5n߿Zat %L:o z;Zmv-J !߸FȔHtzNSfSV-%SF'b~KSVl m%"eӑV`Dc]>>Gߏ͈#ڑ^-hzTW8xe8s^-6"G]-\%`$)9 $#iS L(˰ϊ' 8%⩻"d]_K^hN/CsFg[ ևϳbg:!L-++ u#׳.z6pf\ hnΌ3+XKKeAhfJؐG%y* bP30DBq Ws|PWIGp<#O/CIPҠC#d(ntmu\:?ϖcjiDZ.=;>t`|ڴwkٵX\̣R05`݇1̻? 
y9 c(8ڸ=%8jd@ =6",2P9ΰI3lC֘>-?8ԀgN);ϿͫO5&ͫh5x0V'wN j9Vk͍U 7w9u-'cdm?!hЄ9\ xDaA2S6t%A}vmn]/bQZi<%I(92-HDB+Zjq7H^IWM4HwMLX W ؞6^]?~}-$Qixp>y%{z"K8m)2yjzǯѷ~xXR/ ˣKe"Zfv¨ɣaB=pJbDO?T"6cL܀ǚs) o>؂$Z,2X "D"(5ב"RAD$烇ES@Ě#$ Ҍh:ςE{4j:*y )L'9.y0`lfK Z;(Ge>>HWud xpI*; &|'"`6lH7$q|5d4A3?ÁB<l_lC9^9چyS{⁶.?E-XwQG'_esjZU' #?#B 15Xi,Eu mh ~Q]h8fCxbæ^[L/Cz{xH_(.ź]mkDinbut]_׃ܱúKGd{g<*4qyxBiKM1FsR-L ]ft0& ZJBU Q'y̢M WI, Q/$]?vut 7y6pzBMn #Mn͹ѾaHB3}( 79DYV>bVK1$nژF^Jl->V[j&4ӕ9,Ԓ >y%qJ#'* +=$3l/JJa$Q 7)fmcJGHX,JxfABanKYmmڸMɊV6X%(jZ8IQM*0"C6+-][/ch=KW'SS+P61q!@Yvƶ9ҥ)N_g_\StHV)q.aǶL7}(>}3,y3yIgZ?JU駟/jBVR%9.Г~[}.^ji#gClϯ >5cfս`S:CFޢC=?T [%=T zQéynGޥ6O'sv CT(!>OC>)f\Pݖ4:D'#<@(/M $#> ;C:Jf6A/DBUnn-Qn4.}<:A`{HX4y/.׃8Ppsՠg$3mЕ+H!)*;g1#. J-o:B~\C&^{F9jn9 0ȫZ7n4 j-ds%򐤯ID*Q 1/zɧ\'xZ!.Gh&#yUUR)(28hѼKj?ja|4&uT(rBº09nV,+w<4vKɁ<(MzIfyz>&=0 uV$|GjY!ڿI ꁯ$,ɡ%F:'hG%NiL 8Qcԣ+/Nhs y:9%:QLs}{7 w5d5:FoOl 5 'DONt8 $Ny 5H;> WGQu$urCPztQÕrT=!NA>8n46muW>h8[1 #LHy?-^_%Qt┏UMz;v좋Ɍ48@|%D'1U_ :gjڻj͒@ݤ=<_-8 4M[|Y~M#n_LJNiEVӖnL%ej-KQ"b]B%/kFF,YE0]*:LCףN`tNA3`4l{OqssSRU GhJ3*ЪMQ 1)m'UKQVm+dym5 91R-kXoWS2T_,c9ݩo@#8Ntmp\{)Y'!Nv~ 0Ưw_W?Y&q2`Ț/4Hm˽_%JPe],HT.5lK ^7!Iaz:Oop]OSx4'Lq2!!;>m#B SXw(_]ЇO&pĢ;f?{s^ݯ㗛۟}g}s}3qA̛~#3~A3i޾־ulBQ<{ӒɘxsK~zT3\ gW(PgLh+dU) 7Tv䲪PuJtVAl׳Ci|R~R6Nuh ig:]OV~4Y?~Yםh2γ>.+ !ȝL)w;f4@pG⬓^\WT:c|鎇m5K">}m]4^} Kжo|>&qd'lO}}Qa'j_`DUkQ[׌]Yo+V*)l4JT4o|&pϺ1xܯe]ӻp[Kpף3vqן>,ww߿ib7쩧v)l7N]XO^]]+5֭\˵|`wzw<4zm{?Y|o(b͎ǻeSCP5 eX墕nMi]7itoϢ.B/s^XqD57wf·rhS*'9#2Q簡Α'c`C8?:xNj?r<7@$$:ٺɾ6O'Go#} JjcS@%4&TF#rB dKtS@ 23BUT $YF7Nh@p6coiRJV?wwpH 6oxv搨D1Î6y^0MӾfĤ,(4/xـ(:%ozCqŧ_o3ACw4>>@d-AȓĎD^Jj,P(B)m%_9J=&O{Gw\pDwmHW,-ڗn0F, y2e\[=%%N]R"̋*3q`cJP /_'pƥ {AyFa|$*^9lPw;5P&޳ޏ\pdy{AaxK@%J }R$u2AW7EQ "숚f?;uБ ƅ@4t T~8fzT)3* gqȼ)SƸୁƓ"pƸBL3.tOF^E ^ΟRqRvcfONpX3?AjBƀ*DJIiע(F*Y&E"tu 7L|' LF9;15$2Q.,E-yRwʒTj2Aʢxnv (Kt'T-pwxδҪYNşN @ȑ>gR}I;ߙ}4lYqM³Ƽ{ާ% FvЛ{Tцoa'q@fOCVl[MUiENhDfj+ c_5>Q[H@fjg\E)nT d}F!oF98(tr^$[trMnk]`'+L=os3HI>a2M4TOM>yc%sO(ZRXΫˏ|OOCl÷;qKCF {Wu郟o R?`r|7r>gW 'z^?mZO^̈(rwgWeh->9>F5g CCtzP}ູ;{a{YpU t݁n1'QIh*j}۔c\рzD0m0K;[ K},cs 
iVÛ33ުa[l>)i+?}<܌P}yI) Ur,T\+"I] $̅ʉdbҲ2>L t)5zlcضYU= T?ٱڮ.sWJd(=dͶ(xQj<̤VP&hRui3X-R t^\FŽe=9!K0\NQ?}Iba{%h_6~˸Ǒ*!ơ9(gNcI5= B14D*!h,N3@oсfEDecJMc;bHPޚAI6Bͻ<}J\=9z͈TmTv;vBNw- ‰}l Mrcg!D_TmH!uȂaڞtPhLrHӳstT"F8j=FE/>4Ӫs:M OƴdKIkHX'|NnQ+=("ƭk8bxg1A4Vfg_6e}c7؁$6+ 尲F0 ]@"VUsvfUV7qX<>*^c$qj=xbh%nK֤(yQ&wiq A?tD{磉V}K y*1ӊ=/pYy}mɦU7O%tT +;ƒ mL7@XCL1P,GiZj9iSVS۰̔0%xBaUWY"Ϗ4#",3BŋGD۩w蟮PAً@>zF/.2#A# ͸p{FcZ_㲲xPx-` s'h69ņm I*iUl$2NZ銁&1)t+U>Jկۭ\%%OJ4%`cB̘TgF4$R+rpVmU|k<%)Y/"2Qr JLpDưsLX_ (&RV*l=e~S>-扫]p?O(~z?&>/X?֏Hb,2^~~7? M<-_üX/_GSf/?\Mb wwrL 4ۓ@bWi5Ӓh3+v <}kؾYP&;>iČ(j"K_ܻB۾Sg GɌY>)Y}m5j6R23%3]3/>b0RR J[}e5gT_’^WR~Qjfa8(\Jz(=dͶ(=(/Jj]j1Ki}2IQ>M9*<%Hug(32)3])JL`YR"<+T%:BmZyX f W`3' j8]Px=&KP ynzs3#Y]6QllOcxw~gM bUs6@ fp-WqC+ ePC޲XjC0,Uv'Ne:2y `W^.<}[)^ΉGw"u,oBlOa?mŽI o\(|\0ѭzwf2hz{{" R ͪKBFi"ӖRƸBl2Øуd{rdɸ["tD>.cEmON㪮<Ҙ~(|PL:`HzVq.i r? >ʴ+\K4n\bd-rmhP&ڳzQpE[ H ^ ś>qC !D Y2RKH.U!L ݚ)S<ɫL0mGkB#cd]R)3:R3Y(dJ$EJϖ*kHH6]&%l8_hÕ{M6tPd"!(L>rHi? \R'()_d޽/%Íc8kq|n5kǻ$+>d|q[ q"hMF#(f'>N;u.nci~;PjQj;xO TBRT,<;zƧ(kUYC{{ EYXۙ᳻egEdpv~!IcQzXѼ%ekb>'aZӰJSlPu7|1 [)rNAJ}`4/+r{0F"7CܤK6șSTI?`߶弪]˹UƩ UjeIdG]+fOw旿N繅&+j0} O.4?BS54/Ss^"S DTLOLJN"MdF%dMZkdT7'M?t=q7&IY< wBk6ed|tt7ѬGCg" Z'$E!$"ɍ&xyV,QjMg^ pӡ7O+|w07oCQ7(܇Gg Lkvc<3ʤ\/>o^[`z&;kM/}n~'W׊eI"OSg}Y|&8j>E%YTߍkXjV!i?Tӂ` I$ETJ2=RDdclW4- 1exfXIb{ZK/>M=%M<:7o=PZ3DC++RChg`% ^7Ƣƪ:A+h:2E%i*roNB9%)*MLQ$I$Vh? rIar ^[>;vpؕ7rsΩ Heg$YKSR0:1ER{I2eN ,hLJIw)fK]rp^GɼAHKU[6\1A %r1.ळQbq` 1(9Ӈz0Ue^.^}͟oZ*W \. 
k&Ar3.?+5pb?s{A|3y6N?|h|,jv~4GD:}oRU-hobCuP 뿛@Hk%t-mK@Z'~3Z:mOq~m%@PWZB+hR;Gq}JTD`$d\sm@=Ҡt]/hr:#\ePQ;YB}gm7/B0Vkׇo]2X?eU73Nِ66׃N HCILØF7UQqtkiѭCLw#юono{n1"䝇h0% u3]atkiѭ]fn͉۹$ Cu (p'^v( 1< i> X8P%@HB91E>kB@yJ DDHXӞ\I$Ԍ6Nm1L0AEib:0] <0S t\\?מW??W">\bY|0qR>M٣Cz2#`55ǍEF5 lUHgih&33VRږ8I68 \*xeY%sP@iyP~(<#0}e#$|/.qYƝUy6'It܅J8>;%qHDsPrz){>iR~6i&z3JW(g'tmb7+/'*3f6Ĕl#7^HRY¯0X P>ā% EF؉+ Zib}2.&sz1(frwBFՈ,/`.Mv0V+Z@K:d[E59DqZrzr&)"pt,Z^!hK| L*7@G+,2m>3H cOhaAY< sC'0A78x ݖWso%-ڪ|PޝrE,D-εbc1Z*9ڪ\#ҹR`rwE#E2`2\ P4Dׄ XnWb6V{ TOlE[%4~o;'~'oob_-U/G_H]qَ$PLP3PҶKQ,oRw4m4b0$*<] N80sB91ᜂc T(D=ԁ=Y lLQ0 .`46F"C1oԢFB!) #ds0FhHahfz[Q@hgH DIUuzغNn}SV_ىsȰ-Tnq"K;; _Cy h'9_ \C2ͼ2'oONXT:IW:qh@ص]u*fM輲h"NuNlx ̩IJ7NtRgxj>?ᎊD^|n|,z1(RݱJ)8$#Ǧt)k Q_Vj3/Ayb'.XZ ٮ .* XoH  ߏʱL@tiG 1汎9zGxԌB}!/X'/"Aj o\K(s bv>N $9q;bgofjg7!߹&APwDU8sZq-h:@+Zhu4;W$_![7zQ$I}Gu;߮sEg-=к5!߹Sb}vV퓐jkOX !S)EFTN3 .$CRh[ ff<7(5L~ܳçqpA9\(g䩬vM'mS}_$9 BYe2;g$У1ۣ>/C{r잍v3ud_4 iiQ 0_ 4 ~}FK,dM&4g\mbh3oe{F\3po אKC؀unRdZe C\")\y;m.^m V~A:m.hi3JEK630rW:' YbwPJFiSa\ato)z4ֻ#W :&χdc'^Y<9]WwΓW9A=9Q-O1m<.ӝd>ZqF= nNӠm[Cdys s%ܻιWQEPWL1RgكKl4FXìNӂ>;-o#ApRNB %(Y K# d<{jh7a$I歉.Jvy6mz?@k1;5zr-7׳m⳱.wa PcrWכ3eLʫ$y!>0D%E]ۻ]r҆ȥ#A{7BswI5XhVsuv\b|&ngU ~y2Ǘ[^hn_,7񱛤Fp).}b(>˳/[\fc6d+eΏeֵOպZPoA56*3Ɉ4ԭTY,*ڲ-4:wN] GeH%z_21^.ϻ2uSECrv~VL7xWn)}䀽*=K+jOJpެ2T;iȟ\EKT训a[Y BTlE6Dӌa֭7`uBC*ZFƭߤZ7Ap BTlE%7\dBmGoZ*4OgFcpab!\>~yb)=X&vbl1۟ǂNQsU-n&fG dfvNp/:V! XX(2Yq!Y !N *%.8IPz>Jn3:f'M1m!" JAfnN՜c(ckN([K8B7yuo֫ɖzwe0.Ùdn7ޖXc˷G m0wePl/ٿbN5ȯ[{Kgx>wK$eepmB-?{w_DIbXk{, nl5ݟN*_B (_8?A?8A%'t1tyCozO8( m"5RsflHbm!FG#P,f LBR|0G3WmI`x,!CZzLQ^j#Qh)RxHø+Rh Rr-=.(/ۢ]">43w^RKI=ږ$0-'R,ô4CR8IY+Ԙn]zZJ a,̡lP m=/]mlQ8a[o,~ $R]S2= uQ -Mê/.[{"%`Zd5=j5-j~[PG7ni]>4J=mr\cF6`h`Q'oh,Rb܃44{j5>/R6j3Q+f柫8}_Ѥ7o>`SL9v6l5 NURN\ ;m^ z`.,XiNe*F 蝿΢>8{Ũ$^AFMC 8izJ j8ޜƹ{nqcq p$np[nMCh=<g 311^iE :v\v.>I'Ur\[Aˍ]YZ1g_Mgd&K v6gg88KgJ0WET5Km2q!R$|a&skEIQjҏV$&=^DITP֣~Zٜ z֊\߳V~P@]ì1ZNWv&Q|j,_mu#o4W ! 
,>3FQDyAHH ))ӄjj:ӌ Nl 9@5dRGT/e{Ws 9݁Lң?Ir}0#R6t;qG9JfzMIM dxRH V㢉syҫL)p=/^O9{Z#ݾ8Y.Ӈ:K ̔˹- s?^zmʐ )wa%Kz(xP+%3 XɻrOl2\fY0%{{/AEdzKAR tV"-|SXN^1s [[[:&ׯp>ٙf?<޿gYh{?cuu_FdB(CdmƓ,WXPT^DM@ųyzh^@65Q e 6V=NV3ZF^ן ,ͪB!=3ٝчww?-V('3bNހblb6q?q=6Emc= @ف$YzE F:"BxY=kZVˆH@vͱo~؁6^24A ^4ְ dhVYKrwAd9A{ܥk< fSJE邨ۓ9`b/ONM~P9n0AsݴV/uZ7Miw42 )k]hJVD!AdFL{0-릌eC[KdTLF(XX5DusT3AFyuQnSU ͎htaQ1u>r(4Se<|SqAS( e&ntlA!'H}zXL 5 Qb{k%bh&RCj 9  ,5!_͔bXr ( 6Z%rQ$Bvmd|??yɇp}IC[ -o$&Nl^2_&ؼۧ׽CÝm}T.W.Lo_a dm߀~ώ՚wxz7@?=([7Q?֕A" }%~oԽkm+ELcAiQ *B@AfVͩ120eG;RKmpdȎ-,-a_W|rdx_Ww~V{oW{ Xzk_DG0 }2|t_B𱟵wsD0Fs|#yA64ZzLQ^jZ׭i)Ǝo.hzZejר$p'~tK -%3~"@tZzZ*_?T6hqGÿ́wZzZ yB>Mb+ԣe&rר8ЖĖ2[mi"[K RB>jzɌ_\רTi)yi %ԣm9NKWK1x @& !%Q~>ul0[ӭhWbf[Fm]O ^Uyٞ\TRD_||o}_KVe#@Ug:Gh+hfSTƧrq j0uHjҿóSt,p ŀ %hLr~ :^('̂cN6$1&̭h˽!cKo6)/mp&1 i:YPLQ#YCṝ6PO嫌ɀ:xp#DUE@x$礕0pg/ J?lN{vğhe"-$8D>|E0Bb((UHph("I&1ÀDn7?ƫYlހ2C6#,ro!O)`9x9IJJ@p# "Q*(6XKKaB-  RXwmkxFȫܻ0}}M^|/AⷍRj{B%?@7l!:ͨtBoWf~ntJ6}kWT)a0<7?콛<ޏwzK8]MݓeYΚ‚dpӍ3_cnPQBjg O5Nvh[3Ȍ³+79] !Kqѥf4(,D zzP'._w&c\c=Er.w8t0Ss4Z"qI ou:Mvv9$RJ c̼5u 8ɿέL=iLhxc%P1JK:yWu77ܬΫ!<R 6?nCd Gy݌Z%$yk)τ2]RK&T\teCʬDhS0$1peFFˆnjFH8;,Es^w_۾A. 2 B B8f*l҅D 9Va`!B߾`1c R"q wqTOWc㏫<o_!2VOccduMS[ɇ 1 Sc̺c2ri5,"cŹFvp 9 V@_@~<ϸRۯUUPPuPJt&SAuC^HsGRb X ~WxL*2f Ͼٷk}6'2 0n *,ֱ[(7$4r32ՊvXca8 nQP!`HT+ cL1xQ`$(pz$Q(fBt+\\ɤ%Y IPUvO]r#7^f3Nj_잍n;aсBZI;M)H"uaIE HI$@@IN £n~g)K B ޸*"gPtbd/nw7k)6ʗsnj9 &4 yd-ݯfs:bjmswjĚ*nZ\E=|j[s W"L&Ts;GPUBTKMKi:^C^C!᥯6WYZt6}YUDo[iO6u./*ihm:N*L|@UㄍNSFg6 8S0Kv >Zt=~("(ZfW|W?]mɋ#~B|7tׂ1st9VlI|.y]b~5؜.P-m(~Esw ޕQJQ*.kH!"hZRa5wšHaP1*&׼8 >wq@L /`$ 0mCwR&|>++Cķoc y}_Fn>Ir>-_M(^ӛyI.얔%؆$d%?J/b7+5o86O/2"2?WSMʆR4FgIΘK8<&:F<N 9^xK Rp&Obf\ Q* ٪*%6:ۋ,`cJ9?D'{Q;XWQDQ RMKuU|vkD@{ن~4bН`yTm-lnLrJ[qoUF8'?ӤVgq@p~>^9K@:D. 
IMTh hP^YfUiZK1goRcY":i=O#:O|z4n*^7z86O|%F#yUՌut%yqg_gy5`paoh/;k` F +ċD> }sUwTNI픤wt6 `ށM;P&zY>; ]#*>yNDGvT֗{7jǐOuҦr Z!B|U}1V[_1vvZmٗus_K"\/OIk%#3%@3DnD%M !98=SY5RKGCh(z&F-o{[wv㌢72l:I3V"d"oft!jE MZ~ X~ O!M qjO/J}L=hJ9)M`*{M8IK;-k^-PE]\3PD7wx/]*[xiOvFS- vrp*D82(Cq h5wOdz_OӰlſysväjMZ*)٠̻JG,ǿVTjI-(iUӉͦac$Y/V7G Ḽ[C'K?TG^coC-+.s٧E{F.Ul`$#Cjz{*9x#MYs)K(,T*f&t[ꌩ}:޿R@wX6(PO@WKU`(F'?΄Bq0XE Pt߹΃/%Ӓ 5oBJfaKP }Noޤ H0)hB5w!cVIBmN5`I*70pI HFRH|$2YdOj(`|ϻy,Sq-}74Cin@͋"G=V*I9π?ffoh;臔<Ьl|3&vQ]Sh?Sr`kYT"f0]d.P cPa dߥ ~ wMt>\.w~ 4s^iQ=M~x$[GCmϏhh?~j0b˜;O>aǸaPȳN.ܶtxX.n/0&_,n߼!2V+{;^܇KQ`w' 4_|Cnr(؋nL$fj(Y*)yUE/$* Q.=fFg̻Rp>ƋDK&9N|nSr$3`kax],(Â!MQ?j =nW{?_YƼ|f <7dq@x/0?VnQ95s(PzÐ}~N:-+ 6E*qPF848nư&ze!ϏWOwa1ђ>uSQdO9'% JC>ĺ;$Qdpj; ۣ(In$'yT]m*ć5em uU%hzf /r]wbݪvn5VCv5_MoM_k~ۄ&T%2n]ƕz#[ć ean=<#Yr7y>ٔv`n߰JD6%m8JBK8GGuM䒄}Յ%,/&*2Te62>x)Foc&}x)Ĉ6\%LApp&\JC ]ՖoC>z~DRh zWBZn׆$Q0iHˑ/cbp;Rǫp_(䞡My]hQN]\RoCSۃF%u#{ڔur>f  BdOɸI9{K &PvF4=FִQL4_K;ǭ͟Fͯb4B4_Qu:ˮvb88I~|!>%P}}B ke"NS'fX+"ʍg?e}mkPD/V⸪rf]WZj$U<vVԚ:5n{:R17kS_iQ[ 4k3i@Z~{Ylh@eUVڠ˯ݺqR 7vۯF5~-@e 8 ޤ F8eJpDh%3be<.?T#h3(R{ROY ))3k32UZls&N(\4]64e ʷAk]^H}1&MsˬIq "rCR쮊Z`\dynB9g|ΨQ+T吧ؿ5ۅ8 +,_Ԍ]i["Lt(r`<." ѩ<~Pxy稔X,dZi7;)Q=$*RhIevxush+PD4vPܤEsH+͈"$M2%z1@bhڌFVCs5@yhB%:pnVzp&Nwib͓a|8^6\g q3g˘4Whf/Wŏѳ2O3"s ή ! q SpQ5{3ZZহNFw3?/-zA{ّ b&;>ļYʧ{I>M5S\ &<O[Q>RGU.O-323g#xwmHr؛"`$3nN|㷕e߯ؒmږlvr<$,>U]Vq|Hf=I5yOu5^'Q.Nݓ"=I8urJwF-jöҝf.; io ^'\:?@i*cR*Y80nIALLGHCʐ9ULДiWݗ qH4FN[Q}2ivVQۃ{= z-#(KZev-).BHj<֓zV—#7".6̑pK~#-#KNb'2I8='2刌F66զѕ75{L]l/m-u/: n%jl/wTd4a񾃱'ہ]~WoZw^'iyrr7@O'uI[m0 m~֩dc=1nC1F w~Q ׷r<}ՋN0d?-c+nlˈVqIw.e.:yEC"8Sl|}}參=v1hKxCo >\\~I%e1CkЄj<ױ'k^1QL# .^$pWWڴکM85"n0TfQp+:р(12 ɴÁܢe[\3\ޖ,m^Q!  
lү&%m4&:<[;=]ԝ<u,%OK{Zo@65^&O(>`%ѹ!gUM~_Nbq_,6 d`9|cngaQطQy0C26H\.bq)6e5Wܘ&96%""#&{r#O[QI{<-%3_=f`a/pϯSA2E0wtmlO0k%t~~Z} )ʅ规Ej+ᦪ0WA)/5NSOe6s;.ZIa]ˑ$x߮A!3[KP= 8ȟlHfߵ,$C1z@&rAn0@H5™nP:#]r9 $ϟi7UoL|_7Pjf{+<$`Q ޞy7__M.W_,Ni`]~9>8jGgz`&9Z>L γ,zB(h΢zV#I!v۳Au]xxWuś㠞o_,}R?Df!6 ѷY r="&0nsڅW$@jrq[N2`% /e؅їbg8 [ 8CU 1AWi4.t s~T 4diP%&%;$5nM!u}u; D9RJ"twxoB3Pz`u% KM!Bh' ]iH)qEt9[/Kћ<.uqUSkI2^2NSoSqs&ztaHG! ~UHOs?W8`>^lǙI넃o3߅kE?Wxc"< h?-nªIyZPw;1^st qɇֻF 'wy|3焟BՈS ؙ"N繿:t??kHb@++FŠ(̉KJQvzW ҐC:>G,;;1DCBScCeK"!,g6eU`F2A:m@R ¡8KxDlێꯃx.ܘZ D‰p$PiSzeʊse},'֔:77EdUr8T)|U J8P䖢j^WH<@oNQJVk?5(;vw|rLj9t^2\EmRzꉻ^aMo=f5=,ߟw7D֘Zg4Lp+06s?oo&|\ ̇TN?B{b㺇r=k>^_\M! HT˻'?P(tQ`}0]_'?ȍO A}΄nCl;&)GԚ.I3 ]OZ#l5XLx{rHk2\߫’2h!WhBfnfҜu?S誛Y&I;Jȟ)m;LT-Z '}rkB% [rirK\-kc`6  kha Vnm t:WMPtlJs(- <  0)-uzbTe5y⤬F@f䧪eCG{rR|Wcޅp*٫q=Mp""RzSi Jn(Jɽ G *pOhB-a 3ݤ;Q `X$:,/d(p͵3VSUU餱Z2Zz1*d&\ifs9j"}рhyL֑dhߖA8Y,t({VS or=Pm:E֤3W'F1Aitc490qwKB)L̷pLBBsm/S+Hr&-:yD;hF(cӲhbTG3-( )3OhB-0>ᱫ w{rDnT7<|kZRGjmNG)yꑷg|&t=QY#kLjD8xyH  4/ jd2Ƙ!WTA)>Ұ(ZWة$D ɝXB3Bgč-c C4cFFl0bU-쿭/ovMڦsskv֣> v ;?):ح䫫gf2IހB).:T\f&,ZøX\ftnqtb4K5$8)ZpG6d^U)m3~Gװ]uvP&' ݕdDl{Ew r9èء=͜ =7|]ci}­TA5y,)uH)D]GB@)%I}JGf#J'Ba#hP1 9*rz$+ۤlOZ:\tȲ=ql=Bp lTAӵI`80U @j#ŭrsCn׹9N\#Ʉ Tأ [, .>􊉔;q+zECUuzǀHSBE5bQ}~܏rkUn'PLjd~,ľf!5c_)jp24tpWBYU,BIUx-˲\ 3r0O5oo&'.qWl6d].,xPLXU..{!EW}iN Ad<àPTRwD ,+ JUic"tg?j hhTS+jf[M:Jf@%WY)K.#bUEE1uC)'*r( ƗY"MtSk hs}Iɴ4\Nң|^N+o@򪔂q+"`]mrN Lp):W o)S`xK<{ݵZ!DmS3jiO#L(lxkhdѝkh}3p;_DfD> O$(@li@7QMp1ɑ!̌Lq#7?dG{g/JI->Wf>DHB{@T6R%hw'#ϦҠd^jP%USSx=eDЧTNNqOFjoLۅԻVL|V@m]#C[nFӌeZdPckEWwm"}ik﷽ΫS{MD߷&jyjs3z4 %^@sǔWm|s!d$E ׏f+Y\; =6T3__M.W_ 'R-Wv諝G롏B8h=v)u ]L'vSOy¶CbS8$E"f!0 Y&`BkAPZk0G υ'`3\jWIYYTrS܁".?=N67L IޕO+:F2PbPR  FJU5T(P(xT>1-@JT0\oH|Tcֆ,`x΂9 X9 %)K)qR~&4JRGXU+*Dz2:h _1[b羴DEʤt~UJqƴrxq!H!+_E h J(+pԞHKt"A ҂ J F `J^2rAH(@ѻa~?v&I ! 
k7#8[*jY"P* @%+E`@3n ^Ybf%MOUwC1cC'ձ:Fg:|+E4ki :uRH&Ytnɛ4%ԮȤdC@EwKfZKvFC.]$D!Zd"5SIېVw__|l[W,\IAh 5֭uT:!0 lW K >T7$Tm:A($ !̣隈]ckt|̧݋:Ϛ?oi5G]s8WTt{~.eں}fnS\$&ڑ-$g,S/HTMk4lM~aA]_?ŗ KMWJJvz ڑffS0"'N->~X(=#P0vC$iôS+{W:Q굺?=ZY@c9|[l=OrI:TNas^\9߲}ݳ2M_\%e>ӏ¦)~ T]WD 5!'z=4{bv*DM1mڿpp~g#~fj-"Us봘Qr^+Nv|if2kloW6 QNs$uK$2ry6ϳ*Jti;%]p7n83;owLV ^u k\~jG̔#z*~vZ7#q .jo8Xp<>ք݁JLV <1#d,cF :Bgc,2H[5~r\uϜN_,v͘ WP0\&e4 }w4]']?w# 6IslUju2y~~t qyv:N)˹Oo]r|Ik/0]c*TڰB=;2k\= Ruɗ}м?1[z-'jiKBx%NTqwغuf/@, C ciel6%ޟCts%ɏ!dӚBC1% W cj[CD1 ֧ar5Ve+L l4˰MޠK(թN okPUZhxQnT|嚷 `Y {+714}ޕC=(gT!\[& ٧*hx*n;s4YOjuJQZ`}v r1uA]N3-$%tN4&̐\AKT*e-V,/9nZyq.n>w*y(yC&hO4y;|!fC 1|)@F_ XRM9|!/spn;Q)$o&@SrNxPU/r5,C~a͗l磱*~sD4:5-1KXe7W )nWYn}l˜"[3Xu^1)4E9)o;NfP oP`e[a?D/N*bIjj {_>( |AFPaZ)Vz0~hU<d.]/(یzPWMgD )uG;AA" "1Z ǞKpmEV5BOԹ \>F$ʍSsբԃj3ݯ ĉ4J,˱e ~D\TwH myJ / A,`kMW?̍}bHl]خOڋt>ӺC:!a=xHx3r o9QdTZT2+5)`D!:{ $,95и9DT?*7v?C;3Ԅ `^N{նrO+ ԗ>bG0~3TArH.WJ .Tb1ϯ./:?V|G561zGTmwpEW@<,0}X |HM+A$dH6@ #![@u;;7v;.fmz?kX::S1kM3h_no^@8/!9W5',(;2D~BpHPu?6[~{dqvyzĊ/8Zǩ{6'#¢%';trO)DTiQn~=u璣?v$h^-;k7zTk¹Aw\oF9Յ wXpun cGFY#c\gft;~}tv|}۟ޯ|珫>c=s+r3#m\ԂS;̮ײ߻s(` Wt7a=gAvU7!oEcxjwUnl,cy:淀,EN~̀'>]|"<V BX4A:u4GZX#mcQsL!ϔT7΢Q<%U=Mղ;n4a1B[w ɤ[|tkC8S`vSR f\nuWI)w{l 6.]3C\z;mqjt fYwɵɈ?OZN|Ht~aGŏ >]|Y./1Z]s:7l6Zw'*p &S–KwVaCԱPW D(U/~\(gNW߷p,˜s+jk-kE2r jgV}iZ3Rݞ7q7u"3/?dMlG`%mw C)P/Vk|$iQ$M.AGO/L~^GrvGPEf53L5H h۟LPhPgrN1X3cl(Nq3³3`)AEu!_ csًHKE 9[+ՏDX(6Ժ|JEFiYo3BarT#.VrjMm*’RLRZ'qT҉aB< --$/e[Lq.3p`l\ zt5yj~;!6Q9vlX[^U%9G|\9SO,QT~ڣL$hM?sar Z(6_qЈv}(*lxs"WG>z; dBt?_0ŝ{OyaV%Z=VgVv[mk[yTffJNqS)krZhrmrd/aR;y>vp.w ܬ߻ƘP 8Ԁkk)q)\PBTTPpDکyZs1 E_$ z% s*q2QdJb9PƘ9deF3ם0E^+rQddo1Geۍ$JaEp̑,$g,HIeFb*dZв@dVSSUʣ9I^{n7FS3@/1d< #$# eCF -́\)d@ąQ0f%Ǔ)D !E(2r8o+9W%8C Ǭ"[D%%fJ)4J t^6!%Ũdd!I^bM-zφ ZjjdLvbDfH2E^ѸPW4TPS CS/F QY94ˊ"+E.{Ijf0e<+(2RFP1f8Gy)u&&RK2, f0XP ;Wܨ\2ssO&sDxw9󇯹r~5K8+ȯ\Eyę _1|Rark(]759B $ K\n k})`-q>Q@NoJv)IU?F`XV+S0^@A} Y,ZIDj,5aGYb'26>R"3&SLL7(lR Xp'n >D3mmd%[`!%$uTobO7)v(KY??0g/g'I>yno_ %nW8/8좊^NoGx0=G|"AC(]CЧ_pu5JOwj O SwO+ 
ѿVq7K.̅߯!3(}jv>Vz`+=$94]{0^owRmZBy==[(Zm#2&D$QwQ~:9+F#=e{&"{ԥCua VπH \ 5Q㭀*,G*!Ygv} kHm:F×vp>n kYaEcf`BJ5:YSB0_$yTs1kOxmpz/K>YQq$ FԨa*6*ց|7!oEcxaP){ї0^M"Kؙj%NKqZ/j%AubnY=)CzJ_)nMpg=Db$Y}A\OyѾx&-:!2ez&"M8 VLf%ug(a%48a&bdңD$OALTD+,W$-Z %U&u{HZL KAY>n$0^D;6?^'D BHND 1a,ADVzLw&|| e" ʩwADLbK!)Dv}LQZ0,5bdqEX+`gJJd1X`p|6nQSz#lNc0)q<79v2| Rg(P:Q*/~I rrUu*_Saf06הĐw-f( | 0/LĖ4n4=&#QAu8TlVnv'yGeݧf)F, z3v9j,cpу/_%5GI z cfh{-Wz 'QєiÈ BsFˡ Bf@ SfZH8y*%.@R*nP`7vFjOa:+t{[ כRlg^9aݘI[R7TzZcƈ- dCD 'I5'"RM(MS9B94QE. *  Y3 cLLFu! MKcw_T&ϿL_:)n^ q.~B0l2 (ۻSGV+_l>F|ӑD'9GoI=w2a {B{t=8NYاfJɓL%qS[/2a^B.7SY?0|ǠQg̴>USog_U6E_./_Q/c6o:e1[q:d\98i[_^G9bИ u:W=c ڛ13=yEcZщp˿r[+6W}l,9RAuC :URћ*jݹZw*FDC'Gv]5)18wL׿ӳ+]&vvawF[WS, RdC>fi/D3l]8$K(E\aB?z%IJe#e Y0! z.`c?+b{A( `B3uXk#H.U$߬opQVmeR[8^[E (Y*RhO]Uh%"J]*h%&J]ml1|MıJtZ|8`oXf("8+L\d+8% 9(䔧X ҂VOֱ: qS\~Ѥ<ˁ& AhB:;3̳llt ^ڌT;MG+K2\@Cҧ"gTpvғReEZ#aޔZRp9m+J)3>dtVJRPt+Ҳ]zSjAKVzV B;wr. J?+-7=?+Š;zKjmY)AsgVJRKcg+=m+%VJ (Hj>Qyj4{4Z+&ޱGgYaMD`2'@h \n͡(B\B*)VHF 4 !L"C^yVhKsLdx: ֱ[Jmgz7Dw*YEy'jSB9HzvIVzV B2&v9w9$xJy&7eZR* ċËidPaST)Ka*eHD0ʙIH2R%Ik9JmYTflIc94(<տ~=x y|/.3ZŤJd4??/Ĩdaެ7%I0&GJ($sP0DR!C{qq]֛j:ry5ĈAUS̨cj۽F?xLz1^Y2N?5ʇs]CH% wFX2Jv.\I벾<"3qHuDKr85Cn f@k11"PńFкUN H 3&jjVM}Tf)jugFlt?}*m(ݬ{uqE/rKkB^ݛNBzv "7W+$`i&/Y}Kz>'{Rlhs"!|,`NJa?Ezje8׋?L*N6OYoΒ62R_oJ-p()&~jMU*}~^t+aзcO0*Ņ'u ')LA5$%sJwc5<,jZ 9ZzS!Q_`y)g;N_QwL/i9yl=ickPTwǎo}q)k jDKZ9{ZSXGYͿ=kq8 Pg |P Ac{]h{A[$CbcMoZ@E8,f8R\ y8h|nK4A`*-ֹz; wUpCZ I5fʸb0PQPy:S*( %8 C"@L^mB~6؜]WZ] "5}j 0]>gbJO1= B)C/$'\#"Rb*SiAr,9O5pSՈa#Dؗ ^[]nt1& 8'</rM$ PZDSLUd!23QL)@`%b f7SeV㍭|8Ъ( P5_)Ay׷OyƂB kt Y `xQ>\8W&f @+30F|}a||\zwʙS|՟?h>3AhoWj7C<L?<ݙٮzU$ܲm?P a#?2+w>Vf5>\& E a3oVL eEn-W W#EOZM;8> W*TIiAKd/#r9/T۟,OlRdf 2P?YTv0Ư?,gWi}5za#Ζ̞W +[܍k{-(ظD*$Q\*k!e0\Tv*>A.[C790Dek74wRܯRC^S x.~}%<ߟ|=ְ&ߵDwbywWWKϲ@k RC=3gC%Ϧ|i" Vd^y[sqvDH;moRuZ-FRZEB3gv:9! 
gt[ވvMf<|w fд,- 5/ _N~Yv|5ALx=yz0vV=6\jvԡ/ͥfX"f aDDk@XlM\~XjF !{[R pa(>Å SS鰁}axya hM5epn8,}z@ L'[oq)u;oݻWz;7b >n "v tBQǻ@ҹm~ZӻŰ߹6)`2u3~{͇yžfc4;y%F)kN_J1+=,ig+=i+;nYO+J+<ӶRAX)0W"5@AkP/-y H4`,u U.hN2蔱ѴBkBy.z>^m768B"MSO&Ilefi*?0wE`&E~ ;܃ ȣ7:gs쿦Fwa[%UrJP%ͻC%H f:R0s0K{ '!\$1ΐ{R1?^̅u`IR?}w:+wS[=xb7yϿ$hb^׻;9wjGU)!HM #*?vSM8I `^Ȅ\%)WEh1,%ɞwRnb};ɟG>ͻVݿ_6OvF) lƋ$?&(1$'˾}c-Zk٫d}@B)Rq!Ő":&m6bwۛw^׫Q-5vp/U Ռ[~қx<=Mx_x@/rl NTJ S$11NR1Qd\LeffJ[Zgm` %I<]ms㸑+rIjp|fr\.-wь}M(٦$Jmdfhh7@6սvJk-}h\ d'ATu"eC}ODד/( |!݅sAu7tŧmuw)B, F::!LNbivڒ[R<%QjqB N aJEGfOͮX=@3yt)D t8#0 9Gt~ &\=o8 ([jܽjp>6Wv33[[ed 32"nxLp vN=hdgK΃32A$" @w@:&>6 %jK5(1Be~4AܠTՂ cTaє`y!1Z t\KZR%5J1pn5+X=EʄXPJIl!_]Rb/Τ|u_ޗwwʽ|=U9H:%ȹ癋:F(h";8Ouܻkg۫)q g ߏd"{bn?l}vY7m?ɍ{}ˮCŸo;hjBݷ~gL>u~f_[zQm>BoZzzl9[Wmp?0rUEr~HY:SJ6!^|[]fW :] >@[1ڪ >dY//)mZ.<^A CBTSk׍垗oNA.{QtfN I?Ԗ^g〾}^.' rww݉xƦƱtW ,s& ;)EDJ:ͥHP^bBf >p K0EadQ"*J6 re7 Ea y2 2T.GS{8 ƙO(|Mc׻Uv]dq+md߾5aA+!x'Cx,tvJq#N< 4(ht JHGC5bCOsaļSx@ _A{@oMH{#pQ5l}.~9/ ZIatCsq6j&&h[<_Ax~)^Ԗ`g &\=`&Ne|bߞ:NwZ`7RJNf$!Ae0WpV0 Xo59Ezyl"*ߢzq=BM^>]AHc\ɚOj]Q)v' oiI1js -.-m"(;Rk|T?U $>GUR̯jH)}xwwk6 |⢦ݬ\ve~YlH}sd yjƻ x[7/nVL^"͉gNVcjfH6S>5&ٮkud!oDlJ'?Ox7)wK tRFNks--лua!oDSlJ|y4SnN3bۘYۻn]X7=6%8;"L[u>E-rFjquEtu )s%Xߊ%vW>N q_Xk0uR>jEx4TRK%_%~MRImx@BJjc&I) v5llƪsc@@&H?;E8G \k^kUѹRtj/NV#x< c29gOa^A~V7ZʩЄe?Xtm*%a:dAg`1! 
8rn*<ԊN*CjC)KkάrSY5( n%!,LTLQiLh3q]aw'I*P.^GYm2jh˂3&3 e\ Z,KZDআ?guuoo^u5su|–rO}Ą@1ah(TD)sL"Z,ܭuΈD"쁶E b`-G&@K |of$1  rsk)E$h Ƙ"*&XD P'.hFC M8/,L!{6;l4K +Y2UR$vr .>×}<ӂ1U>@SJ`D4@f' Mtԁ7܇sC&=Q &f}AK8Y pF8n|z k',%TaMTVm4U[%¢UR[P%РQI3=XRS) +خyA9Ύ' ]L{"hyjf YH\d6̨9V5c,?q Zcγ toR tҤYAi"xjc_ͷʜ ykP j#'qvӇM<#s8e=E5>[B|H{:]8N~8.:҉AM1Iׁc .cK#4;wYq g9jwP)4+4Ɉ%96'Y^d%Hdى@T|TeA؝-o)2$/C+b2( 5( K=| 3Qk.(4DH#i D$B_\d2@Y#@K $V4sE SR&9ad0 jOC;ԁwjqhxf jzJ*Z*8K*pOn >/uqK?IU.<\z\imhji|ܯo]iz2\^)KBr6+ MpbA@k~S KunywSjxMէ^>{R!;]%f;ڊ3|^rmJyKx筰zSR4nu ?PcԖ9 NRe`7+OE(B7oSC;ˑ);02]o<7[q`11f d L^ ~/A6'mT+(aHΡbc)tVq4UaϱM ӺƋ`0S\) #B%RqM82KCr,Yqb$vY ۖKAP^>/XҞl;%풐=]@@f\]T K>D@GJV9lf׷wY͏\OVui'y]@%w_/dhw^Þl"N;F0p}eWߺ:IIM[%Kj!@»,h흒 $ϕTO8)Z{Ӹ^hN8Vk9+- % )\!+$ʺba6U@`-7jn=ahUr.hGrfJN]Y)"A_WYJ0qmì@&a/^EZ R Ĵ/Z;_/N-g5~SqzFIdX:aHwHl2UBkcDxp9JA%r:*iMJfuخhlׅt΄|bh/3ԙGʥ5Bꟳ!ƕӠm3.&n7Z7?:wׇCvHܪ9F?ܹN<,ۻgkК>.\s\ϋOPgg{ƕW1a\.wTbc6[ߡd;ؔ%v%f8ÙvDHVxx^ W̻-G-r{0mv Zvz'x'i k?LGL3I>`=Dm/*ՙG߇m̾Қ͗+ 7pT)e܌=|'y?_wbC{23!`;0xkI2%:E>e0v FAMe)5YʱǸFjё\%/lyrc/LNySd6ǫ՟C2]ubƺZZ}[lLҼtOie%ЍT26F>0UӀh̟Y6c7{3ޗl|?9ƲfK.YřT.2K2s[ W(Ǔs)Μ]HB'aBg$\=Mfp46ԕji=֚K,͞F(^]-Q7+qf?{G W=a6;9Yef$<εPDNFx.>=$OːgB)_ 407S* A4E˅63l$:UNx/jrUNx-% QG!Z|B&Rʹrޑew)SMjt1rR[J)t3Wyk$;9Fk3 \Ob/a7e"|avSz3_cJʝ_}zq8mt[SLG}@`8я' [^U kQh4#)֚K%Z 4?9ر.({`I=--Wih?-ȷdbց=t)߿_o'&ꍻql.`|~~5 O?1|}kҥ;|F%|{ݻŏ^&qOS}wo?OQd d~~Np@S^fF^bΓR6knfMլx8n]ן:tޚ`:>Eso<)@;y: ۖ1H6dI>f]@Ixb‹{Oҁ' WTHL7&'#T=,0_Ύ&ڃ/PL S[u/2CO7 ii>%>4;Yߚ~/x1RXzqr% dSgq{?e}{&+4 _ _ߌ,)c #L0 Wi& //EKW마p2p:7 { /,Q`x?rV'dݷ* r 'G)/_F/w10s6oj}gVc?Ko߀oQk|4孉?-AY7I,(bƶ&q9 t%{KBfm'+q3%=ZoОv{^CƬ>OYYilK孵=YErs.]J,'J|\k+"P-+ۗGe>VYV8M]h Hsûo?J \$ի3|/%y^M=O "ڈO8zIw=ؽ\?9wtmH, ,TA/ЏQ6XJ~D4f;St=V/S^;JO!<]g8؃e<xlYԙ#1 eh5G{谭kQX8jH#QH+s}HJj ʮ\aHs狂 7\ LcYhwW mǭi:T)E%y])HV""F{#Dy44 A"CcG^eyDP@aZeZ>pn[P~/ BNPX 5"Ƃ"c{SPy 4`l"SD+jG =" ,Ah||PIFTpPA)_)Aa`p\ C-{rpFubDH[롡wH gƝf63g0SbBB;)?2R)'Ad0-1ur@gE"KVʵ#34G?gl&vgIl:3^?{(-[:0?v+u"s>j?7q^MZ5vynn3L)QodڏSo:p3c}֣ $ L/ŞuA/'0{h濖rG%ͱ 
/qx1qGj3MzQ8hBE&Qx=;kEN:1yx<#7d2S0o|XOY.v(jL3K~p(Fqh%c3Wk7Ӎmh(I)QT6ýHv1MU5`=J> T,u{x̖hk-%| ͍Sh "{@.G 3{<rA:n+DZBxW/@46w#LzTF3dW?wZHmg }D[i ?Jx[.emϋg 4rX l\PoIJ4Jufc)4j"[ŸR|ח]דIT*֚rqi>~ߛk Gs6œ/W _gk 5jv+ZV.O=Zצ*rV]@'qiZQtZ *A Վ9;J;NR 3F®7bH %ҷbv di#i@@TNnGo)Ռ#j4 l5WycPN!0 \̲$Ϻ}_uɃmumLG`pt${ނjY*s;:_ 9x SF.ܢI,I`V'eI3R2Rv@!&>V)FV#OW%QrUv+ QU!!\DKɔ$Goh7 v+mv; x*n*8j*$䙋hNLɎĠ&Vn5xb)8JtGwg6+زGF<[$ԙ5mcOVsحU MTkؓe, "fT+ޤF>q) ߈Rh&'TtB)=a)e5=J-gu)p9昚S J#'-n0lߜn*OgiԤ--P(tWyn4RzRʄ.ǡKpҔj]zRʵrRABJv҄j2)==)l9oa<)=5)%MJTMT_VTFJOZJrVJK[X[C2ܮ؁AK-2xˆJF"K|)%#*# G/Dn\.uRX\`9_"̣_ۯ_7zUͽ7nH}Wl3Rf^L(W`nc ԑ"l-o2|ó|!̍RLaJNa>oua :Kϔ* 3R!<2%Ѩ.EN'%oR_\*pf?}T~ cj: kxRVM䷟h#Gmlf\3U([T`L)dmvaӶd'ʠ:37Ņ5OH=A0Ac5elY1ΨLcLGj*F:v00 ! ]#j.ȵctP)>ܝ\5׼a2;Fsh pFCx'FLJȀ VaErn[ī`~B)L k/@A Eh| 4iD/KgSlI^0lš0D>-\Kn| 'Z W+d`lxY;H6߅xX|U*&ͭ㢇%0(j̅mSvP&F|m+삍P*D`&U ̆{ ٺu_:#O~X$mRYI1-!Z^?Ttm^#`1Owۏe`&s]cb|T)v<8Bha3qR! $Nr8\ ה8 ùj9v&9ҵsq= *QkTudlZڸ86Yw4/ $XA{$+l !H(Χ'A)?w B(lpx䠴N\̫TKvBxKUn/Sty*YlƎ~]婖$s0e1pq^"aSW!!\DKɔFG?Oj7BEtʶQDRR]v+}ݪg.d7W♑p@V)FvE4zڭGV:Z2YkM⠗#{g7 8 Y)ۣ/x@n1l!m}6?0u; ڱ4po0>{q waROͰ[q0j-Hv`xTM4X՟o{apYdβ1qͧUFgs1ﯷodڏS-95˼ /D5xg/Fx|L,/*A\k8^jӍ.B3D]ҙ;?{ɭG,Ha%ǃIvUvR$m=왑sӰ}Ica_~B"V/z}&u@>@gD񟻻O/+fj"[nurٜoHzD37c+{ؑJZNPr#j+l_' %~wDԈw^2%_ Y+$\.~)= $h$UPs(c 5-bgF`M5Y _|ھam:kE>֘Kl==H^#*5>EAHw Bfŀ.#dsI1XYC97ʜL>:gi\(+-mcUR2x(s(\tєOr/F*qd!y(IDG>ą ' ([QNΈ#ZU)C9ńu:E1=!GW=?:3L5;)Ot .@?EzfE:%lm9H`睥T@%-xiJ %rHօ.X$}\dJI؝:F, YI SKͶ,zV kaIA 2*+Ok{'R'¨:}hk =q)DtU{ֵfF*0I`ie6\Q*uvFF\F-{ǍZ' F6CШ12zWT“TғBo VeA4FGa90iv`Esp_0̿Zv_8SL 9lj͖%U$$ aڽAdqXikFxU@Z&؁YOVl0*14WE./:A Sȏy((GA+}+qק #b= =B.4^8 {D3q􂢡-YG`2vvo@>^8 g-*wp~0KNM?y<Ն9xػIKYnݐNa_?=wcN5xf/}:)M~r+Iwr vg@Fa44sw Aiz#u#h,kx}CeFf>V{ز1Zb>-k#Xٹ7V;עbV1kA I}15,ޖk*΁3wt!?DbkdlB(s *STs-snN;X(Sʍۺ_JKnChW "a#=Fܺ4u Eurź͸@vRym *Sr]h)di[ϯGG{jO-jFiXԕ/bCsXq.<ҒlD!=#ewɚZ1anǙ4lDK(mgҢ~K,ǨE C(Cngs=m،dzL }wn޳5\?Į}m2y ]v~P$Fn _GYñ[R?˻aFcG7}ŗ `vaaSWD_;U]#@kBa}R2PND #׺Ϛt ؄N1j>8ү<%I;޿$YNaN-$;(X|s;9@D|t8ǢXD:xE"W&8ad($L|wqHC l.A`!ԐDynbt6VȲ1;):B7몒%E'ʸҨPV!-`FԦUhLV.3$4vѡU* :~OWr4@Mx.M"MM%e=߼zB[*X`ߡ&IζOYHr2ҋ}3 
M0u{/_!)>9[Om2Ӽ&>[-yw\StLǀ߂'3`\̀]F*8*&_Aef1էB┳N) ?& ;` wݓ.*!tʀĐW7_/e[ag${tYZc[/b R>Eϵ {[a/0 (qT2$@T@ 1VU$%I 2S[.~  Е`A`H^ik ѴZ+wz}m*6cT뗛[wrW^/Mս ;r׺w9}1uw;} (SV::Qz$.}BS"r CګLit(E\C-$E-N_V' ;ރ$;5׋lʆ%'ʀ<׽0(~򇖃&ad,|;D[œ"]%^UVb(l/*X i )jC*1TW `rZ9 })ADi *7%=~Fl- S'mh!Ioý6SXwlX)D୩U-s<r\U*ށJcf# b핍I,-h6, `L/!I0 #Գ|wf-о7l1k簽{mԳa&e~k- 'bR.] 'gU i ӕ͏ '!\EtU8\=ɞu39X\ RT']یAs@Nf%Z!4䃫hNn3kĹotkaF떋A侣usaɬ[~AuBC>T"5|NB)m5>9Ab) X)0#=/Uo<1A;4XwGV$DuqQg1iA[sPBAs hvVEH!!`̀a3FvSO]´r2uv;Tu~r}|5H_JApE]FR.kHo;&j7MD}bS _.Χe*xc䥌CX+Vo^"$R / )f߶D(Fl{1Xp2hbep"K*Ծ8مpHa6Š/~8@hd&‹H nbzK~֐Vf ?H&,eaέ]K!SyuEŔ%V AxK{!4YFP18z ;EmǠ4R( xjpT7\\+2*J|ؓH}zFjh% #AW4t=@*{nbBJm} AЉpqiGbn=^'n<ԪF+Afb4cѺ2o!e[#z]!ds7ݜ/^j=zFH){.\{Ț4ѐr-|آ`Ug9B.h8c8?l DEһT)ghR"'Ѥ__L c|Ƣ$hB<܍A:IE\qƢG'>L<)]+{yf|ji)$'4NB&&FEū_"_3Y:Ah{睠ox@+h.ٗ$/͗-E-5)-)S; ឿI էD]W7_\/K6ʑ5p"*+*|]vJqi q_.딕#.I_6~}Ӫ]ԧ&K94݅&cHlcTvc?Oܭ"s-0E;8&"t֗`mwBWa9S{ ǻA,$EsGk^)u12,;6Y4,ٗ\˛Ukl, ;2MI 6|]*־j?hk/a P4E4:ACik v} )S z?4̀`Qi񅚋RǠ' Pn/4Rwwm)@@’h]]p uk}!WS@!W !V?OB~hER"<UV Q.xtu}_|6}[KC+0CA.i,Ǩeu{ Ŷ)lQ,GMZ]/cGGVtg8!ݡMT./q@'%/-/Nq|.&l]bþ'Hh.ʍ$C~<]U~Xir eo\ſ6=ׅ2 ,iM[_r̗ױ{7\P ::ѺL} p{y_=^nё?D[R{ǯD<B4?^= u S ' 9o8|M]օ f}j\SU\bVU/=/Y~\(fwer`l45Ԩ\j tNc.o?_׳~lSX7j}ܲ~\6w;X䈄F;݄;X7[[u{Ȁ9.`ؗǛƯW dO]4ħp ģ(%V֗Syvͱydt[3(>~[Ɏk[e *4- +eS\]`J6c?ï:nrЭ}qc꫎ Q٥Oo,Jهc7ڶ\cmʥ:nmKU$ə3'E<l^٘G/VM[_d>탦dک30l!p*a0q"-O>r]T~Y(*4^ۓ84il-Y p,8^x>,Lqpe~LIQnyvO7Y9)n8,?t/K{۾w{۾/Ƕ':. w-9 ZF0.y2whҝ@ M\3eC,,nʪ@ oj+UMSYm.lRѡ+CB8/i–-Cq;RKg;#A4O5i%Fܳ>,=+!585QlcƸD(”sNL9S + qs !~!{ QECJ]EXDp sQδRJ$<@8-%f:s4OT +6^ܠ" l{30S?ϬU"e 643#A n*Ҩt qӜvuaxLvYKvK.sVK47:Ar/-8tH[R\[LItrpDҀb Y "lD#p,%{0o+&a={Wr'ƌX{aGw^hx+Bܭ%🏬\P~*׍*T ]'*?#\qr%|}RूtWŸB }}GԠIW ״ҿuQ#SS&$Q1GoHB0({扵̢$BtcxL;.fғ藕غ196v̰ʪn |h#{oMY43P̿g3TYLe>dvpY7MQ5*I͗s/[iv";F Y݇ts&ފ(RHf/q4Pmvb@ǣ8QFyiRm<v;hwB ʬ #*UMrDZ(E*TBJ&T&1YHšARvԚ6Ւ6 5W})?25Z>'KI=z&ҋf)~,x?XOQUjYJanRwÛXU},@RBފ^tԣM%B+K/“b($C*\ESYԚb}&.JK:J ߯}xT rV 7q,-و;5[rܨ>DRp%;b/yJ~4P~l "k@&-Vsk(ʋ4.[9o~?׽ۻWa%'J*zm:N~^}P+/`I*0[8}2L0Q:Ђ0. 
J]ݮRWg_dq q`%; eX=\wN*!If/+zA r' /X_maS|*_ V=$@'BjGRcz|m`|5і ..@Cz!I^*ۗ8嘨NjHbJs.߃M6Ae61u5CIhPZ<ᖆ .H[B!Ec$ܵI mn:W jk@)q28" CB"bB0H&ԱahC-h$" Vyn0"lZPOn?Š(8n56 &iͦ僷o1zHA';N]%~hHllm,FJ +bBqcQ"(`(2HbfZ($1PjF67'1'!&4t3®o6>s=)/U%SY>%",1 #2[~ EU~juj\Ma|+"D| ؒt(xvO62Q)/ *~tMM`/ BWAø@ ?*)iwO0״mt@ ű[L!A~3͝{B۟J14zN5cNhځ#Z'8$T 1b>GU)*m7zN "Ή<"+e';n4pcH|2 {(+f"&};Xs83;pvps8X9WǏ<ܼyZ&F4N8XC|L8椥}S)΢+F"<;/A9Gb[Ͼ6v7ξ=L ̻JeH<\̀^Rm]5\o]5p~ / OZSL=T3%;7ZjnΖ xi3{FYa*׫~򫇃Y- pW6AK OVF{'rۯbCtVeHR [ |fgjj2;ZoCpD?|j+atJBYW뼟Wfsǣ%q-8l '8@Rt3N A/}5Z I-j{_LMrqRMc܏ZaI>xWRK g sq*nWԊ]*8Jn˛gV1LE5}U֔1pz2íbL9HàV+jbi:_\J/3] cO&E^dm|jeRjm5;r]nγ/zcOЩYׅIR6׹OSx@ni{BCu]WZbȟ8YMJѺ5Ձuu;%B3|۪uk!r6wGϤy[SP;X3NyRrS[<[ StSqc}RRK[J~`}c.{=qO4} nvY0}1G/Wp#Ż7nwD!~GUE/2U~,"HKci&ZYQ]7{R"XI&lRRاj?fRW^4K9c)gY zTZ|:^2K-KuEVK=JPlRY!R4Z3,h%reK)ci.WYʸKYy,e܏_z,ҏ\%ЫuNrLjNY*b>,u׹/;KK=JYWOf,+WHuĿpNw9aKJRk~=]8K2IaKci&:_2K-Kuو//*5W^6K1c)<W`)R!o!km1Bɣ3qD]&QD\2M"+q(M02H#CCL&s]F&Y"0$ "*A,A($ UAI[#h3KQDJEeE' fН "&*`«X 0;,yP|0KW+4^ ڌ*:S8x>M]\'c;NEx>ʫF7WM1~$ň2ۼm~;秛܈;32OƱ]} "гM_TSSS}YtnJݺ}K;j9f' Dpmtԩ'7l^>\1#.C/}g~xr+ܰ,o#Rz_`+  fLﳢF'S|5f2{n.zdu1A6u9',u6]T=*G['ϛθ3;~џ"Żrl5QԗN.Lk:G}2f8CQϞM}@.VU9~.]&sHJj\q-%f\ Iŧ8{t?0.߼]R>TL]wbs/! 
zЎNv]9QN)Q=:TuaMl'_84Uw^~x{joN<̊XdP6'?L&NW|o{&ctWjCfml+;Ec5!Vg\9VX 9h5?:TXmܮ(EɉP6:EJxb11y }Vy}xk-瀑nmTV`Iׇ.F+(ՇDj8*#%\MIJJ1σw6Ik\O06ѥYŹ)lm[EL)*/ZX0]'E 1'H[X E""IXhH3AiDY tuIiOj_?VE$rA C* auNYl9ٻF$W 3,}1Xvm{^<*xYEIū:He7Yʊ"222tbMD; kP~҅Z]Eg}߯ |o 7]y_VK8#Lz?%ΤBȻߟ,V)$C_F<"2ꟿ g/oh6_a9.;Q`#{p{LF%c䕝 faL+0oK7,9u6y9`qjoP@6l,Q=*B [TՔ&sފ@Fb70[J`ܥFEmWM$mѼLqɮ~FN!t[..~ 0rnsLd_|V77MnDo&_@|w9}6C;_/+<[e>+cFvyv:Un v"0=& 95`r2J>,A^I5ۛOӚUTň"N3E!6OMyyM3-4k5 _n'ċP6i#/કsSTj,1/(RԴ9 pC0}0v*#L +f^pnu1NjYuy\Ǒ=xLRQAz?r$#tđWVbѲxbX7AW͔ )w,3cZ3~\6ZI5[t>-)_䐎[V^ $mipJj) )5)/kY( E_B33-{gsP,?qp6f ^V둖:j<Q9B"8hC DaAwHuWād '#^2&hbġZEVJ+(cPN8?4 z O$EV3 0ƞ9%\"qxᝣ x'ȧJ(f",g̊ /_cԡaH(.h5Sa/B*È]N`j_!lb`gSxnt (m!%vy=|V_6wZu*z<"?d.C2%|a4w=77)'-2(A70fyOO]@Кu1< IQGxMk; uDt1{*Op!34A̿Iƍ կ9s4kykf9 =B-;,QcQ> 1*|IVX0g"Fآp=؅}5Ka?K'+?ד/]aU^VK#tOQW}U7}k>hZ~=XzO9CzbnlP#ҿA?7U߱c5pE`ANnU? >/|.g;{-E*Uͫ(WU\ fP+-%6`rq4^")\s%6Ȫ ;X*-MZA] >^C m Z,nB奟G7~L\$i^LhTS7SIobbBG WsNBcRGD1 C9^XAZx[8sDL@69;|-L}DM P>KIZ@1beS^۟jD e\9emk3T  4[Kd4E9 Di F#ΈR]y56X +w5.FHN2Z˼$FPX) S#y- s&02^M"i# ]4o 9]K/Dz6JbB$΋BK"c^DZhh= $iS37"/)3@lK1ᝣ24!Loxbg I'r2Z|ۭ[ZjTa3I>#;:#w9R>`d'>2󡒖 X#1VI/J]5y6 O)5Hb./f !xl.~x¢=x rjAW?fX=0^ٯ&o8T߳LG$(w~KxxHO6<#Lq}Hi^M+;}/WQ4h^EѼf5͐v92yk,yA H`.s^w@v5 d7}L"w+ *Y-g}bHU*,ΒX 1IH,scOhɱأ` SǝFZg)I롳h6xsf5 ?.VƳ夬JW9-5{ajYM [/ \LJs~TrVJpoXzJ@$uSb 9sR~PHAW2G7()'8>! 
j_'ϡ2Z~qS 3,;Ah A{Y' jK0ϓqx\YΏwy\+dM״ >llaM쯃rG73^“M6S@tWC)u;al /z] XɿM8Zdع{d|KPiW) MWKr.yctT 3uӻ Qձ;>vikCK8tL^F1ouo51dau1>[fDv)3#}[0ŧ3>걇Q[Ո0s;NxEhR1MPK3uJ=f,?ӽdO#X[Q(b'j{:8BGNs`41dq$ɽ7Bk Z9*w&'pr"Tj0$!2( ]-&щ鳐8^3i:I |40ž4IDP@ U-5oo6gדM$*r[V;30f}1U\ xII!X6>4)~ï|t=-җ4,w,/EȢS*{dJI`F'ÖTFLGeJ)lVYrŕmaِ%=OaK*-֏͒vϔv_)mnLeͳJGIK95G27/+=qkxdG1Zh<8h>cr$ʓ}gx03d#;}ڿ(}1~o޵6r#En|_drX%l/n2#ɳ,_%٭[I[-6*֏bD >3BRrY#gj1Uݲ-+~$?Fc;`T[!ڜׯX0{OPԸ?XݵW#UJc^]K1 E>T { חX TQWBb8sclK|'(y< BՅ!o34L;\z-pPCF3tl.y,J[8SK bk~*sNGQ> (^,Ƈ Ui_qPJ |3uMx]?YscAa'zR;$q`\*+!GAȣ Q]!yJ.ruKS9Nh)7$<T%[Cxꋒ*/?Zhx5.z$kN6 ^%ہΗIS'BRۢEn?gQo*7PH ?h:,\z9;oɜ=B?;mRP 5n ̕ipsPcrJw072aԦVCgQr;Iw8"#Cy m%ը;47HGQ4;h /Q*dk\OQ%dn=ƌ/m@VַV¬5Q,Q(Kž-R3AgU'qGJdh[ReEd&be .UkU]W1 ;|گC7a./)SC+)}}^d0OT['7k ʮ=J0mu%e=w{"ouAiH/V qN=%Q|4 <JШY"+hL`½<9r|g%sхh6d!#e!J4x[esS_kz\lU[4 <:ixV6:s"笤0kG1q< !O?|ͮlÒ7pd!yόa"tsN2hKJr ݋M}ܪmd ~zR B1ȎG>J4; h'3-Ĵ_X=# }g>xO&Dץ#JSy]'Ў𮯋B3SCેm#a_=SS)h&?*NXM8k{OռI %LW/PgߢzYr_Fϝ>~i\Rkw% 5`_׀Hip⽫ ( 28ͦ7'+ ]A82Q0Q0Q0Q݀ g^ WR2(+q v[A#FxbT}7EIS_:R5:rZGWrL2Dn op+dLIWZnu$).lShBZ<|pFTfBF~,:[Y%P%#ҔF*VUnw(\auh1bxqx_iʴ~P*ř@!?|vXf꫏>qb0'LfeuןH@o.ҨL7v}'\pLDQpI Sp`bg@\F4cRm? .r1]ro778X m׀(JT {ǀEfѣ8kT TS 8k$kHޕĕ %YUiךzQW&eKE)08hF]&̆b j\r);0aIQ]qiPB6B3ffmHe,E0Qbk@)X3GҎ܆E `z$"L:4닥]_,iӌhBj- iKEp\ mU 32gd{ECTh_[~:"iZw|%@g9%աl1~pq%fUo|3g# `@LM57X7.:@aj0|荋51ۣp;jqv{uyktA._AwZЀ<]i~&`X{n}dza=d!NXq ղeZ\>jkLZ:]U>QΥ{4*m$QN6V|s)nN{@Ռ1X3Ͽ:v]! 
r$D֠WMҬWm6Pr]1Hɰg4n߅ ,ǟIF<MAhlQ˷H0_c1@UW@tzJKy٨FHnAi*rUQ[zʰО;?rQlIC^&Tx`_C-FY-9wKAꤾw;_dZ;wj׏nmh W$Ҽw-1&IPɻҏGz64䅫NEthֲ 5O]u3ΫHHE%ȋA*n"W$ֶ\O.x/jLy^w*Rj]!x lhw}S7}J NdLm;AGpMy<QByg34kؼuc A:\ lC21.L-aQwn&-nAwѐӝKq4ZD7hrN&vZs~:&h왇Nwg'7gR+v>ᱱZl}ғݧz]O[mӘϿ1Y-O3|{ +'ѳTKyfvZ-YQLVr*IA-NVZۀlũ@tm֬u?Щ.x^kIjWpm4~מ\Ljx2ןr >J/{nEG1zǷgPxye]ޚ` @ ZId)hUmBEiȣ8y[QVsH2yQjsR!FZUYƠԮP!uE\ [<GJf9:l8^K '?_Sq6^rˮ8\v[IN/#LYCC܏'b!dZ fVDV8KP¶(]'odd/i"uflc+ J7:gD/Y(U>kyvdFbmo4'qNN}*1,4kY>#>= } x1׸Ce!`cK/LV" '5 -3%B2+Jg7Q@=m,cMBzdI*, ͌0',5/e IM\_nZf޾Mjf;r'03)hAQ̪fTH^^n~/6!=FCxjKl(advq9wEN`2 } hL5%>%x=j˻M+:][o#7+6K7E2@ IՖGjɒx`)J%ydny%_U.dJQ%ZPj'OTY PVQLe6Xi5::eMǰ]yBj)_*li>;|D Q'"N&@-V+u4pi3@uvyYj,'%CTBȪ݈92oVSJ5웯EƟZXxL(U[3(8ڏڙ&16_MTJ:X:- H*d|%[&StEƍRB}Ȥ8)I+@ĬFMkӨHP [iA[GȂGJzvPyc]3uֺ0}qq^+KZ:DIOr"D2e}KY3nWhYρ5!-HYYSVWK1" Vc#![ױh qyr Q1gGUU :xsż1h/DT Q1m)y(zK48#Cc2Yhk"UQFbh`2ä$5˘LWww>RsVqt탍(ȋѵ9@Bb9Q_2Hy1Bg1rSgsF/k"b># L*5]Jt7LF|2~4tJiuYߋ cbi"-2J/'3YX)Kҳh&m@aT2 YA1r͉3`BU!Rf ۄҬ^b~ ڡ_6TŪ`$i HhgܿT({*O...eͣ?d-QY0H͈AZn)]+ >Q/Jh:Rnc9A%I]Ddcv(=Sާ܄Ninyq]Nu*õ!2 /Q*Yda7D3*J.\=]1îǽ=0)SnV?pTD;P?Y͂]qh о7at( h齗ȗP+\1IIAE櫻XՃCqxb CΛfTC4b U!FUQ;C2y '}3."[dDhnDw ӜrڍؕjD?yw6s+1]c _Gj.HJd}Ph$zf:ifa^ 9~p@tMΡ^ڧr.7Z|kq[AƂd{HF hT'1Ck5rL(0azX\m%63@ KY(p#{ڪlGQK8yzR%V]-q.WcN>(-8Kճ͞%?ys2Of!på:Ni01&Q 2M 淉VFyjA*{KFnT~,h/[ݟ\/ BR%>R2 .MCtxmDzv&qӻóf+-lTlilaZwd9[8}p}$+U؉)|d!`|}+K%WsN'ldq,T%4}@vM%wpe͔%#$ {DJF9,1i`'7w5_i.\ALٚI0zyryΪARd:2 f0yg`/۵rR|&N‡4 |e9 Rt~ַ1jjv i6%&/seh3@\p%Cp!7]Jt@:LIsۇ!Dk8iޤpU&i6ŢL1rJ; 8]^}d*=^V—ͧc'unZhHDSJYd@hps-u'%~yS)eܣPsF~9-.CH9n9"`SlGsr99_spuq;=kgyIAF"nqdf8]7%7e7eHCͮo'1kSYnM|qO0V^W()~|cތaϷo-7߬U:A}X^[/8b3F鲂-ۇ5Zq!WIƘ: lMʀڦ3q2]6ܻva\[غb#B^9eGHx1gDA$zCI*ߋlVP_l Z{:r!-d !:hO8]jPZFLNP*j}%@vJ+hȸج%Jl}?H;]ܧ]V'}&>Uy/iHfi`Ŕr/M,*mT7:ߣ%=^N[+S`54#1N#HMSr͸-]eRM<]o\9J)e)gEb{Ju&& \eb:emJRjh <6шO0tƸ+<#8wdNØADy?\/o4gᙄP\d޲jWSr>\}Yc ͻ)z"C֐AH ]o4qS*, XrS,a ]梵q7M2d_Ig1@2PBƋmYMT@ZAj?6u"|~N>4r>ouG5k#҉σ$UrFhX钶HdnE>>յFM҄Dġ! 
(ְ!Z ו wkEO|A0[UVjQjG (Rs[NGem9+6^=)H0Z&`Hg1,EJ#?ᴧmOf#8K.~<}}/,f(Mp zf=BSӽ)t`qJz#zsל{&)3}3>¤vY'%t"#%7jM/u\&Cş-r :qzH_-l_)n)r=kEvw:)ٰY'Fyw+ \Qsrk<{Qɘ<8G_eU_}ۏm+{[w۲ }kbZ^zEj I\>vao0w޵4ŕ3ٿһYJTj=L|bVqCOJYZ?}FosiLFm2q&8smb\`Nӡuj1NA.E%N^ZR$af%Z~m?y'_H|a7Iuwҍ/siSMX}=F_=8R6g e8y"ݓb>~R׻~7}qq/SoA|_+AHۭA ƧԣU˿>\]o>w*'[\(ZV dCu`L Ͽ{asě% +!RҕDInŲ7Mߊd($%ycڌf&c}9wfxˁ~9eG '.3y=/8]{-$t;6-`'#g\Now{F͹sFz)-:_ι{| ;K5 hXXέ `{}H^IJ[ʘTrQ&NE,;=YH3vM\W盁'qvf0Xdܛ'IaWYk l\(>'"N 3Tv,B4bAo5")] /l44/ [Ox>*p>:p$:~etjUXS,l BF %Y,ۘCOL ;%6W'wT? SH̡?ɸxת"T6eN1zJa90\O>\&{W'D N/JZdb(ėAW `ٛm*\\Կ/1)׷$=93xp}z?箧 gs?p~%87GwihNַ _*-UE @9-2vdK񪿦g{M6!njC >P IC2 ύ|B- &% M-$W#*AJwfSyNrs IZ>Ћ%'A]AFcEe"o[BȷHɱ nFQp}gOzz!yAۡT5Y|r>(Y$~k587ͧV7Et't] d&d{.uρeKU(l{rer_}kʯvby'OV&̶]܉2:?9C Ҫ$$t&eָT3 ֓ F AnjuP\&Zj4pF^qNl|Cr|V- P.$jei-D;na*Jh p0:\rϨ)q]ADr![F~l}j{7sCU7wꐲN4*(*,` c rWSχoyUy%? gr6so}{} ey™~h gPX CǧgQI!(B4C6fUb+ߴEvOсTqYB*I?ԩn-V&*ba,>0#=:Ns8?|8U%RIN2FE#N/4Y#]MD=E{A^ߪ1[=K^w< ܋su>[}8Bv{sXUkxХZFHJSZAh,rwc;'!xQ?mmlT+YA3'}.ʞoŸyQ~RYEݮ..`qɟDgNUM>ZŁ8H%~n#I5 ˞<2WGGn &aLe@^iH]Xy.d#i*;!aRN@2Yo젝,A/) H<,0L-_w{8hBFzvZnjאI0ބv=JES@QM)/ws> ?ժH^$ j8RJǢRTZu"l!娶k"9Lt~z~?k xL Hsz㗭υ='0%uHwC驌O*>(NOƞ<[c1DLߙHvq0!ʴ !(+,6Iy_J*A#Ӕ9.k<2b%Q/[P/K=Ь|5nPx6~2un49xb 0(1=š^Ƹ~ݐF* -z/<IKe^3l-o'wbG^OƛESw{}z^FziXh"gpB0M!X4dkRi6a!Zw5*dշD2@YUXRrB\pgLz, adtMl*&j,䌡(__!M gzZ14J&0I<"4Ff1C7S#x^!gn}+lѕ5k O1”wv8b3%j%o巄[^>{[]HK`K8,_ ZT[$=K0$pb֋4I<vn)< 3`8 x}+^I_Z9`,2nWu.؞,2W=FޫAT`<W"w"olz. 
ҟVՔ1Ih(͊& dqc5Z]#d"4κ-k*C)`32DIfucX%YG5OQ7k;ey67x~!n!BfU\6ih{^m o{Ḱbfx[{S_,le`yH\vp<Ab>D*T]ayJaQT STkIJ&qHCT_R')}RJu)-t)!4QTJIJ&r&ՌSm,V-qIT_R:yO[JNR҇\Z}?yK)MԥC$G!%R:TV$~& d3O^~E7b0JWb{6|`pp3§7NA{˰?%')_ʷ~m7PZ??\%X!7gHu~NoJ~SBq̢V4wm吅<,/ Ƃo.b)'Z]{gU?|jhz4F{ aE3ܗ"^ֲ;Xb'Op%ě ۆR I_%@uǃW"rB*FdJ»\B1Ns { ZFZ݈'"|8"'%utzrT-`1u"da~< :xRj%쌔P4 sUn0bf HbiBT(s?s oΟ&J"),[9'aw?|ѫm 89D9 Ιd}|E6Xl$ :( XJW.CϕVoُ>= {/feǰ( =_|`fo͏϶&S#"+z ku>" 4FWR lc0ȞVdy"0BKъrlٜB ڐ ctc+ǍTâhu,19͖4.eIbц>^0֨F>y85N@A J`ض>^Ņ K1ZػiB!IOw#vB/'{>*}߲-ANe^#w]1EY!g8Τo_XV4h^EѼyU7l(\reaLΐ;|LurI xD+;槂];SzV5$c*<p1zA۴3^JE|5|E:D4 L1@u IYs Oe09v qVQ%〼\Ǻl ԁcz9cJ}Ὅs*XO^@4(2**nI)h"\ }V,fmnc"|o4 eR+ߨA1Y)'#U=qM\|I @ZkxNϽ2J#1R iU1‰ǵ~J[9L@S/GLԓjͽ-w˗{ \ȝ(AbA Gzgǘ;b9 oSa@rM`$ lt7S?b?nhKS>oYOXVOpsAHkBZ||I./n?n|F֓ oƾM<]|3:oDDv55d~~ !og>840Z|;9H!e,p+xiY@:`F4U`8G`C[@/ s,O#$^+W'uP"$p\' S#K,ȶT }R` 8!\}.1_Y"( #LR*5]+*^ /j -k*"UL?%@z_3ʊ筹䵊Syk6?zcBl3Xz֤n!>1͉cZVoNM%"d,Kzy-^Q^lbN׏~2O3fN|Neq\ 2""3_D|>3E Wj}8 mIQ ivX6$wHK[j\`V!Fj ;#Cbsؠ&zl2m7k?n;\%!hР) –5&{D۞b]_-{JYJD-/ gdAg1N09Omm:Jkh?Y{qeszfE6lWܫ+^rj=>nz[1,6`0DEۀjaAZFFȒ^{=;C`BY$us @?,=¹5XNb=(k̨ڐKNx<9ʚ2U9hܐ&fչb6e<ѳht}?+j0&^K3m~mf7⍦}uO$Lf}jizN5hvہr]lt;D=pɃ5F@9/ݏ@"ǽ*4;SEo<e= W3=] )O8:x8xّ F0K-^T^>3ubm&bf"b'b %}X͆UwGR׌PKhmV1J`M`f , 7%SB5|Z8`Bk=ZӖy]R%Rg kֲ]%s $:0p\WBk i&nk\LW;Oh=Wp|@ЈpI_B)`%xq$lbXssxfc~4հGib}{* e\4VgbiEl P#FSUj.h#|D~^YGRK cߏPԆY&<dĎDKm,K^D^NJ75Q+8?15Af,I `>-JaٷcMۛY0X'a`hJ`ζFH4qY<ќH"iyN"B 8IKŒL X #p8୙8oQ#܂EXO)Lw1QT"@0a˙03}bXF .TiIxJҚSHc(yc1QO)EMz@?2!$ g\fXSCm T2*WHaUZ]'f=_ϻuBiyV%5o)WR,\?Oe@jX8_~T5HDa0tV30{4 v б~ّ(bSrP3{JLl=~Hj>RiK]'b $S21Si NC|%?lHm- #'Hmt|4]z?t- MOINzgHTU Fm.ViD\gS3@4yꬃMMZ(LGMqlB\dr46b:HcTNUv_O^ݧnC=ƝK,|p )H}F[ |T'Mpng-|p )=Fdcn2Qw4n-z,FH-<ӺА;:E=uNv_bhy†G᫬var/home/core/zuul-output/logs/kubelet.log0000644000000000000000005153121515147025513017703 0ustar rootrootFeb 23 06:45:35 crc systemd[1]: Starting Kubernetes Kubelet... 
Feb 23 06:45:35 crc restorecon[4814]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:35 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 23 06:45:36 crc restorecon[4814]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 
06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 
crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 
06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 
crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc 
restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 06:45:36 crc restorecon[4814]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 06:45:36 crc restorecon[4814]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 23 06:45:37 crc kubenswrapper[5118]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.443714 5118 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449369 5118 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449414 5118 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449428 5118 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449444 5118 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449456 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449466 5118 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449477 5118 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449490 5118 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449501 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449512 5118 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449523 5118 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449534 5118 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449546 5118 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449557 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449583 5118 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449593 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449603 5118 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449615 5118 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 
06:45:37.449625 5118 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449635 5118 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449645 5118 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449655 5118 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449666 5118 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449675 5118 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449685 5118 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449694 5118 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449701 5118 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449710 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449721 5118 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449733 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449745 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449761 5118 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449775 5118 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449786 5118 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449797 5118 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449807 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449818 5118 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449828 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449841 5118 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449854 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449865 5118 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449877 5118 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449895 5118 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449907 5118 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449922 5118 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449937 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449948 5118 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449959 5118 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449971 5118 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449982 5118 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.449995 5118 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450008 5118 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450018 5118 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450029 5118 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450039 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450049 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450060 5118 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450073 5118 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450083 5118 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450131 5118 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450141 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450151 5118 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450161 5118 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450170 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450181 5118 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450195 5118 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450209 5118 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450221 5118 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450230 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450241 5118 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.450250 5118 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450444 5118 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450467 5118 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450484 5118 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450499 5118 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450515 5118 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450527 5118 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450542 5118 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450555 5118 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450567 5118 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450580 5118 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450593 5118 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450605 5118 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450617 5118 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450628 5118 flags.go:64] FLAG: --cgroup-root=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450639 5118 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450654 5118 flags.go:64] FLAG: --client-ca-file=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450665 5118 flags.go:64] FLAG: --cloud-config=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450676 5118 flags.go:64] FLAG: --cloud-provider=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450689 5118 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450704 5118 flags.go:64] FLAG: --cluster-domain=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450715 5118 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450727 5118 flags.go:64] FLAG: --config-dir=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450738 5118 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450750 5118 flags.go:64] FLAG: --container-log-max-files="5"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450764 5118 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450775 5118 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450787 5118 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450799 5118 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450809 5118 flags.go:64] FLAG: --contention-profiling="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450821 5118 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450832 5118 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450845 5118 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450856 5118 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450871 5118 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450882 5118 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450897 5118 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450910 5118 flags.go:64] FLAG: --enable-load-reader="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450923 5118 flags.go:64] FLAG: --enable-server="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450935 5118 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450949 5118 flags.go:64] FLAG: --event-burst="100"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450962 5118 flags.go:64] FLAG: --event-qps="50"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450973 5118 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450984 5118 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.450995 5118 flags.go:64] FLAG: --eviction-hard=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451009 5118 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451020 5118 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451031 5118 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451043 5118 flags.go:64] FLAG: --eviction-soft=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451054 5118 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451065 5118 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451078 5118 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451090 5118 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451139 5118 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451152 5118 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451162 5118 flags.go:64] FLAG: --feature-gates=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451176 5118 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451188 5118 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451200 5118 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451211 5118 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451223 5118 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451234 5118 flags.go:64] FLAG: --help="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451247 5118 flags.go:64] FLAG: --hostname-override=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451257 5118 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451269 5118 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451281 5118 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451291 5118 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451302 5118 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451314 5118 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451325 5118 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451336 5118 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451347 5118 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451358 5118 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451370 5118 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451381 5118 flags.go:64] FLAG: --kube-reserved=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451392 5118 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451403 5118 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451415 5118 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451425 5118 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451436 5118 flags.go:64] FLAG: --lock-file=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451447 5118 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451458 5118 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451470 5118 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451487 5118 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451498 5118 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451510 5118 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451522 5118 flags.go:64] FLAG: --logging-format="text"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451534 5118 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451549 5118 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451561 5118 flags.go:64] FLAG: --manifest-url=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451572 5118 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451586 5118 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451597 5118 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451611 5118 flags.go:64] FLAG: --max-pods="110"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451624 5118 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451636 5118 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451646 5118 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451657 5118 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451670 5118 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451681 5118 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451693 5118 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451719 5118 flags.go:64] FLAG: --node-status-max-images="50"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451730 5118 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451742 5118 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451754 5118 flags.go:64] FLAG: --pod-cidr=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451766 5118 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451784 5118 flags.go:64] FLAG: --pod-manifest-path=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451796 5118 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451808 5118 flags.go:64] FLAG: --pods-per-core="0"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451819 5118 flags.go:64] FLAG: --port="10250"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451831 5118 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451841 5118 flags.go:64] FLAG: --provider-id=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451851 5118 flags.go:64] FLAG: --qos-reserved=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451861 5118 flags.go:64] FLAG: --read-only-port="10255"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451871 5118 flags.go:64] FLAG: --register-node="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451879 5118 flags.go:64] FLAG: --register-schedulable="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451888 5118 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451903 5118 flags.go:64] FLAG: --registry-burst="10"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451912 5118 flags.go:64] FLAG: --registry-qps="5"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451921 5118 flags.go:64] FLAG: --reserved-cpus=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451930 5118 flags.go:64] FLAG: --reserved-memory=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451941 5118 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451949 5118 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451958 5118 flags.go:64] FLAG: --rotate-certificates="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451968 5118 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451977 5118 flags.go:64] FLAG: --runonce="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451987 5118 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.451996 5118 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452005 5118 flags.go:64] FLAG: --seccomp-default="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452014 5118 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452022 5118 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452032 5118 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452040 5118 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452060 5118 flags.go:64] FLAG: --storage-driver-password="root"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452070 5118 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452079 5118 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452089 5118 flags.go:64] FLAG: --storage-driver-user="root"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452127 5118 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452136 5118 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452146 5118 flags.go:64] FLAG: --system-cgroups=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452155 5118 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452173 5118 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452182 5118 flags.go:64] FLAG: --tls-cert-file=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452192 5118 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452203 5118 flags.go:64] FLAG: --tls-min-version=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452213 5118 flags.go:64] FLAG: --tls-private-key-file=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452221 5118 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452230 5118 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452239 5118 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452248 5118 flags.go:64] FLAG: --v="2"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452259 5118 flags.go:64] FLAG: --version="false"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452270 5118 flags.go:64] FLAG: --vmodule=""
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452280 5118 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.452289 5118 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452506 5118 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452517 5118 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452526 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452535 5118 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452547 5118 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452555 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452565 5118 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452573 5118 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452581 5118 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452589 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452600 5118 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.452613 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453555 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453582 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453597 5118 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453612 5118 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453624 5118 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453637 5118 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453647 5118 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453657 5118 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453667 5118 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453677 5118 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453687 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453697 5118 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453707 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453716 5118 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453727 5118 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453737 5118 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453745 5118 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453753 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453761 5118 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453768 5118 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453776 5118 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453784 5118 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453792 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453800 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453807 5118 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453815 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453823 5118 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453831 5118 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453838 5118 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453846 5118 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453856 5118 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453869 5118 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453877 5118 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453885 5118 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453893 5118 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453900 5118 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453909 5118 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453916 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453924 5118 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453931 5118 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453939 5118 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453950 5118 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453959 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453967 5118 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453975 5118 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453983 5118 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453991 5118 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.453999 5118 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454008 5118 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454016 5118 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454024 5118 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454032 5118 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454041 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454049 5118 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454056 5118 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454064 5118 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454072 5118 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454080 5118 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.454087 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.454153 5118 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.465550 5118 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.465591 5118 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465712 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465724 5118 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465730 5118 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465736 5118 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465741 5118 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465746 5118 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465752 5118 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465757 5118 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465762 5118 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465767 5118 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465771 5118 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465776 5118 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465781 5118 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465787 5118 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465792 5118 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465809 5118 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465814 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465820 5118 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465824 5118 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465829 5118 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 06:45:37 crc
kubenswrapper[5118]: W0223 06:45:37.465834 5118 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465839 5118 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465844 5118 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465849 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465854 5118 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465859 5118 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465864 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465869 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465875 5118 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465887 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465893 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465900 5118 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465906 5118 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465913 5118 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465918 5118 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465923 5118 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465928 5118 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465934 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465939 5118 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465944 5118 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465949 5118 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465956 5118 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465963 5118 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465968 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465973 5118 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465978 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465983 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465988 5118 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465993 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.465998 5118 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466002 5118 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466015 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466021 5118 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466026 5118 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466030 5118 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466035 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 
06:45:37.466040 5118 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466044 5118 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466049 5118 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466056 5118 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466062 5118 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466067 5118 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466072 5118 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466077 5118 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466082 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466087 5118 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466092 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466113 5118 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466118 5118 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466123 5118 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.466127 5118 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.466138 5118 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470180 5118 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470228 5118 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470235 5118 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470246 5118 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470256 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470264 5118 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470271 5118 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470277 5118 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470284 5118 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470290 5118 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470298 5118 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470305 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470312 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470318 5118 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470323 5118 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470329 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470334 5118 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470341 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 06:45:37 crc 
kubenswrapper[5118]: W0223 06:45:37.470347 5118 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470352 5118 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470358 5118 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470364 5118 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470369 5118 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470375 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470382 5118 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470389 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470399 5118 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470405 5118 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470411 5118 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470418 5118 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470425 5118 feature_gate.go:330] unrecognized feature gate: Example Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470432 5118 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470438 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470444 5118 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470451 5118 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470457 5118 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470463 5118 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470469 5118 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470474 5118 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470480 5118 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470485 5118 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470492 5118 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470497 5118 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470503 5118 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470509 5118 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470515 5118 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470520 5118 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470526 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470532 5118 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470538 5118 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470544 5118 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470550 5118 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470555 5118 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470560 5118 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470566 5118 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470571 5118 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470576 5118 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470581 5118 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470587 5118 feature_gate.go:330] unrecognized 
feature gate: NetworkSegmentation Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470592 5118 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470598 5118 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470604 5118 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470609 5118 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470615 5118 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470620 5118 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470626 5118 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470631 5118 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470637 5118 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470643 5118 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470648 5118 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.470653 5118 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.470663 5118 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.471851 5118 server.go:940] "Client rotation is on, will bootstrap in background" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.477481 5118 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.477648 5118 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.479544 5118 server.go:997] "Starting client certificate rotation" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.479596 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.479807 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-27 21:57:09.799161529 +0000 UTC Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.479975 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.504571 5118 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.508143 5118 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.509503 5118 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.526344 5118 log.go:25] "Validated CRI v1 runtime API" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.569568 5118 log.go:25] "Validated CRI v1 image API" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.572524 5118 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.578003 5118 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-23-06-35-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.578500 5118 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.603457 5118 manager.go:217] Machine: {Timestamp:2026-02-23 06:45:37.599854398 +0000 UTC m=+0.603639051 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9f1192b7-57d9-42cb-906c-9c985ef0a7ae BootID:e5e564e2-37a5-4e17-8e7d-53999163ef5a Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:51:49:20 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:51:49:20 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8b:ea:9c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4c:39:80 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:28:49:ac Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:90:13:d1 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:87:33:70 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:df:19:42 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:e8:ba:f0:71:38 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:7d:c7:9e:78:b3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 
Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.603935 5118 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.604187 5118 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.606815 5118 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.607152 5118 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.607207 5118 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.607583 5118 topology_manager.go:138] "Creating topology manager with none policy" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.607605 5118 container_manager_linux.go:303] "Creating device plugin manager" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.608202 5118 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.608261 5118 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.608542 5118 state_mem.go:36] "Initialized new in-memory state store" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.608675 5118 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.612522 5118 kubelet.go:418] "Attempting to sync node with API server" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.612562 5118 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.612607 5118 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.612632 5118 kubelet.go:324] "Adding apiserver pod source" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.612653 5118 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.617821 5118 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.617950 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.618052 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.618410 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.618544 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.619073 5118 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.621559 5118 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623854 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623898 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623912 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623926 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623945 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623958 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623970 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.623996 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.624015 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.624025 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.624045 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.624055 5118 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.625931 5118 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.626614 5118 server.go:1280] "Started kubelet" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.627701 5118 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.627703 5118 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.628503 5118 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.628504 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Feb 23 06:45:37 crc systemd[1]: Started Kubernetes Kubelet. Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.630232 5118 server.go:460] "Adding debug handlers to kubelet server" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.630442 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.630495 5118 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.630679 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:24:55.818631025 +0000 UTC Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.631054 5118 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.631079 5118 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.631236 5118 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.631253 5118 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.631632 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms" Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.631772 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.631867 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.633222 5118 factory.go:55] Registering systemd factory Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.633261 5118 factory.go:221] Registration of the systemd container factory successfully Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.633677 5118 factory.go:153] Registering CRI-O factory Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.633706 5118 factory.go:221] Registration of the crio container factory successfully Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.639181 5118 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot 
unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.639225 5118 factory.go:103] Registering Raw factory Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.639247 5118 manager.go:1196] Started watching for new ooms in manager Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.641296 5118 manager.go:319] Starting recovery of all containers Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.640561 5118 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896cd368eeb0741 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:45:37.626564417 +0000 UTC m=+0.630349000,LastTimestamp:2026-02-23 06:45:37.626564417 +0000 UTC m=+0.630349000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649387 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649457 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 23 06:45:37 crc 
kubenswrapper[5118]: I0223 06:45:37.649476 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649493 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649507 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649521 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649540 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649552 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649567 5118 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649581 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649594 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649607 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649621 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649641 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649656 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649675 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649695 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649713 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649734 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649747 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649798 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649841 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649858 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649904 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649920 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649934 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649952 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649967 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649982 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.649999 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650016 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650030 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650046 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 23 06:45:37 crc 
kubenswrapper[5118]: I0223 06:45:37.650062 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650078 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650114 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650131 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650146 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650161 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650176 5118 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650191 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650228 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650243 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650257 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650271 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650288 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650303 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650320 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650339 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650358 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650377 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650395 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650416 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650441 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650460 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650477 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650498 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650519 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650540 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650562 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650579 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650596 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650611 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650626 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650643 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650657 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650673 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650688 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650706 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650721 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650735 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650749 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650764 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650778 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650793 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650807 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650823 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650839 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650854 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650869 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650883 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650898 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650913 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650928 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650942 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650956 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650970 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.650983 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651000 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651014 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651029 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651043 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651056 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651072 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651117 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651133 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651147 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651161 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651177 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651190 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651205 
5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651219 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651233 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651246 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651266 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651282 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651297 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651313 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651328 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651344 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651358 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651373 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651388 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651402 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651420 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651434 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651446 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651461 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651477 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651491 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651506 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651529 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651543 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651557 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651570 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" 
seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651585 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651598 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651613 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651628 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651650 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651664 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: 
I0223 06:45:37.651677 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651690 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651704 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651718 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651730 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651744 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651758 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651770 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651784 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651803 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651820 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651834 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651847 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651860 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651874 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651889 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651904 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651917 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651932 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651949 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651964 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651983 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.651998 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652013 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652028 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652046 5118 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652061 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652076 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652090 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652123 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652137 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652152 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652167 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652180 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652195 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652208 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652222 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652238 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652251 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652266 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652281 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652296 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652315 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652328 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652346 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652359 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652375 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652390 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652404 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652418 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652433 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652447 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652461 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652479 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652497 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652516 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652533 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652546 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652561 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652576 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652590 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652604 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652619 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652635 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652648 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652664 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652678 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652691 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652706 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652721 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652735 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.652748 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654430 5118 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654468 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654491 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654510 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654528 5118 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654542 5118 reconstruct.go:97] "Volume reconstruction finished" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.654551 5118 reconciler.go:26] "Reconciler: start to sync state" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.660262 5118 manager.go:324] Recovery completed Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.668494 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.669944 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.669994 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.670010 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.671390 5118 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.671509 5118 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.671619 5118 state_mem.go:36] "Initialized new in-memory state store" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.682329 5118 policy_none.go:49] "None policy: Start" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.683548 5118 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.683594 5118 state_mem.go:35] "Initializing new in-memory state store" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.692189 5118 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.695187 5118 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.695250 5118 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.695938 5118 kubelet.go:2335] "Starting kubelet main sync loop" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.696170 5118 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 23 06:45:37 crc kubenswrapper[5118]: W0223 06:45:37.696292 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.696399 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.731520 5118 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.792730 5118 manager.go:334] "Starting Device Plugin manager" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.793175 5118 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.793209 5118 server.go:79] "Starting device plugin registration server" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.793755 5118 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 06:45:37 crc 
kubenswrapper[5118]: I0223 06:45:37.793781 5118 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.794713 5118 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.794924 5118 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.794940 5118 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.796577 5118 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.796708 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.798885 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.798931 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.798949 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.799260 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.800518 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.800615 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801179 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801225 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801240 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801403 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801568 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.801617 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804219 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804243 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804255 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804268 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804276 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804293 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804315 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804356 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804376 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804518 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804547 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.804560 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.805813 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.805858 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.805875 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.806012 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.806041 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.806051 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.806227 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807073 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807118 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc 
kubenswrapper[5118]: I0223 06:45:37.807130 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807261 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807284 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807450 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807512 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807890 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807911 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.807920 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.808274 5118 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.809624 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.809903 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 
06:45:37.810086 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.832889 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857342 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857469 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857516 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857553 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 
23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857633 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857734 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857830 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857879 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857917 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857952 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.857999 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.858059 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.858167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.858220 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.894163 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.895859 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.895921 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.895939 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.895975 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:37 crc kubenswrapper[5118]: E0223 06:45:37.896865 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959461 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959568 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959625 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959672 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959718 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959760 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959778 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959848 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959876 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959766 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959848 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.959943 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960035 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960142 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960156 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960043 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960176 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960249 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960307 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960351 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960379 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960380 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960408 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960440 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960464 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960489 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960528 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960634 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:45:37 crc kubenswrapper[5118]: I0223 06:45:37.960647 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.097196 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.105324 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.105399 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.105413 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.105473 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: E0223 06:45:38.106337 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.128073 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.139206 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.172517 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: W0223 06:45:38.177627 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2cdd2a1c0ba6b2c4cd5f8ba04243554ba88b45ffccae0d754065826afce6db63 WatchSource:0}: Error finding container 2cdd2a1c0ba6b2c4cd5f8ba04243554ba88b45ffccae0d754065826afce6db63: Status 404 returned error can't find the container with id 2cdd2a1c0ba6b2c4cd5f8ba04243554ba88b45ffccae0d754065826afce6db63
Feb 23 06:45:38 crc kubenswrapper[5118]: W0223 06:45:38.182882 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-d47954b6c0eea942afee65679c1ae113c88477933fb942fc5738467b326a39ee WatchSource:0}: Error finding container d47954b6c0eea942afee65679c1ae113c88477933fb942fc5738467b326a39ee: Status 404 returned error can't find the container with id d47954b6c0eea942afee65679c1ae113c88477933fb942fc5738467b326a39ee
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.186451 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.190870 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: W0223 06:45:38.191704 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-23dc9eabc835f5f0459da6ca3c50f99f6467272f744f45af3aa4b30e6b3f5bde WatchSource:0}: Error finding container 23dc9eabc835f5f0459da6ca3c50f99f6467272f744f45af3aa4b30e6b3f5bde: Status 404 returned error can't find the container with id 23dc9eabc835f5f0459da6ca3c50f99f6467272f744f45af3aa4b30e6b3f5bde
Feb 23 06:45:38 crc kubenswrapper[5118]: W0223 06:45:38.219715 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9fe60780759835caabecec6249e22a42522ab7a2921d86eb3ea9c442854e0971 WatchSource:0}: Error finding container 9fe60780759835caabecec6249e22a42522ab7a2921d86eb3ea9c442854e0971: Status 404 returned error can't find the container with id 9fe60780759835caabecec6249e22a42522ab7a2921d86eb3ea9c442854e0971
Feb 23 06:45:38 crc kubenswrapper[5118]: E0223 06:45:38.234551 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.507015 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.509420 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.509490 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.509508 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.509550 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: E0223 06:45:38.510318 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.629546 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.631641 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 17:45:07.512058894 +0000 UTC
Feb 23 06:45:38 crc kubenswrapper[5118]: W0223 06:45:38.649571 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:38 crc kubenswrapper[5118]: E0223 06:45:38.649699 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.701364 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d47954b6c0eea942afee65679c1ae113c88477933fb942fc5738467b326a39ee"}
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.704299 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2cdd2a1c0ba6b2c4cd5f8ba04243554ba88b45ffccae0d754065826afce6db63"}
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.707826 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9fe60780759835caabecec6249e22a42522ab7a2921d86eb3ea9c442854e0971"}
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.708993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"366eba6c1ff14f2d0575c08ac62bae95dabe50e1a83a2535fbf6ca49bf62eb21"}
Feb 23 06:45:38 crc kubenswrapper[5118]: I0223 06:45:38.709849 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23dc9eabc835f5f0459da6ca3c50f99f6467272f744f45af3aa4b30e6b3f5bde"}
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.035743 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s"
Feb 23 06:45:39 crc kubenswrapper[5118]: W0223 06:45:39.073983 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.074165 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:45:39 crc kubenswrapper[5118]: W0223 06:45:39.163092 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.163254 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:45:39 crc kubenswrapper[5118]: W0223 06:45:39.221727 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.221858 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.310480 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.312441 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.312511 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.312531 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.312575 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.313334 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.629635 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.631985 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 07:33:24.19449522 +0000 UTC
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.698127 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:45:39 crc kubenswrapper[5118]: E0223 06:45:39.704074 5118 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.715674 5118 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc" exitCode=0
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.715756 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.715878 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.717255 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.717306 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.717324 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.718225 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f" exitCode=0
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.718337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.718543 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.719965 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.720010 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.720029 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.720780 5118 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ecb8a412558b2e3e3df0e3a609fffc0a9f7798097289a9cbbe884e1ba514b059" exitCode=0
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.720896 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ecb8a412558b2e3e3df0e3a609fffc0a9f7798097289a9cbbe884e1ba514b059"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.720964 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.722624 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.723547 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.723591 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.723603 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.724611 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.724641 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.724660 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.726536 5118 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="73a3a5320f48d96227f66df5acf532e7b02206494677605bb17e127dd1fceb50" exitCode=0
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.726623 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"73a3a5320f48d96227f66df5acf532e7b02206494677605bb17e127dd1fceb50"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.726682 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.728498 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.728562 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.728578 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.732513 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d222cb9a87bbfbcb33e86f0057d23be3c8706114b164516d5e9b63e0af94a65"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.732572 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96e61f007646da29a9c70c05c5e11525eda1c87e3e1ed73002f6d004916d6af9"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.732591 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.732600 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.732633 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7"}
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.733784 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.733825 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:39 crc kubenswrapper[5118]: I0223 06:45:39.733845 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.629442 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.632449 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 10:20:55.105299111 +0000 UTC
Feb 23 06:45:40 crc kubenswrapper[5118]: E0223 06:45:40.638557 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="3.2s"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.737963 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1fa9b9c1f52c8cbe5449f5898752cc62c8b63a26834eb359b86c470818042849"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.738152 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.739949 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.739980 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.740018 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.740185 5118 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5" exitCode=0
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.740302 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.740317 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.741544 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.741567 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.741577 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.743881 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.743924 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.743933 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.747063 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4a373caae03ae33651fc0507171a05c61d2ef46e65ab4bd2d194652132f61ec6"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.747106 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c316f6f42b61e6ee11abdee99648ca74a949e1beb4445787ac8cddc68b463f33"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.747117 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"836d1a68a6b52ad68fd79e9b99d7af9d95bceedcf7968039ebc1d0d075989bd6"}
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.747203 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.747227 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748326 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748356 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748366 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748717 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748742 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.748756 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.914034 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.915740 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.915785 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.915797 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:40 crc kubenswrapper[5118]: I0223 06:45:40.915822 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:45:40 crc kubenswrapper[5118]: E0223 06:45:40.916278 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Feb 23 06:45:41 crc kubenswrapper[5118]: W0223 06:45:41.026125 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Feb 23 06:45:41 crc kubenswrapper[5118]: E0223 06:45:41.026228 5118 reflector.go:158]
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.632783 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 21:35:05.863278115 +0000 UTC Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.756708 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2cb77c0ebf9d6d51b3d4533b013c6dcdfa0d2cebf4e22dd80c42b0f8568167a5"} Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.756782 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e"} Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.756866 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.758843 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.758907 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.758932 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760141 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6" exitCode=0 Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760248 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6"} Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760285 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760343 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760390 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.760457 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.761866 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.761932 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.761960 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762498 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762552 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762570 5118 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762668 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762749 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:41 crc kubenswrapper[5118]: I0223 06:45:41.762770 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.633349 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:21:34.58458173 +0000 UTC Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.771091 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594"} Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.771196 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4"} Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.771223 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c"} Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.771254 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.771345 5118 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.773249 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.773331 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.773358 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.874021 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.874353 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.876130 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.876173 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:42 crc kubenswrapper[5118]: I0223 06:45:42.876190 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.634153 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 11:01:33.797724992 +0000 UTC Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.792728 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3"} 
Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.792820 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd"} Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.792951 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.794902 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.794976 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.795005 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:43 crc kubenswrapper[5118]: I0223 06:45:43.832003 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.116938 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.118823 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.118887 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.118911 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.118951 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:45:44 crc kubenswrapper[5118]: 
I0223 06:45:44.437234 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.437497 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.439608 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.439663 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.439681 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.634578 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 21:14:46.302719575 +0000 UTC Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.795603 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.797159 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.797194 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:44 crc kubenswrapper[5118]: I0223 06:45:44.797206 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.347224 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 
06:45:45.347489 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.347547 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.349624 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.349689 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.349709 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.635243 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:24:37.046039524 +0000 UTC Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.767538 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.798616 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.799902 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.799954 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.799967 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.989788 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.990020 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.992077 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.992197 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:45 crc kubenswrapper[5118]: I0223 06:45:45.992224 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.289686 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.635558 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 20:08:56.792563441 +0000 UTC Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.802284 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.803680 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.803873 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.804033 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.821351 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.821583 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.823436 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.823472 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:46 crc kubenswrapper[5118]: I0223 06:45:46.823542 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.636749 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 19:26:05.414940038 +0000 UTC Feb 23 06:45:47 crc kubenswrapper[5118]: E0223 06:45:47.808399 5118 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.977312 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.977594 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.979290 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.979335 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:47 crc kubenswrapper[5118]: I0223 06:45:47.979349 5118 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:48 crc kubenswrapper[5118]: I0223 06:45:48.637553 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 16:37:16.084360711 +0000 UTC Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.056922 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.057336 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.059589 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.059663 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.059689 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.066441 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.638629 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 08:49:10.97336538 +0000 UTC Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.813624 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.815192 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 
23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.815241 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.815295 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.820558 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.821547 5118 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:45:49 crc kubenswrapper[5118]: I0223 06:45:49.821666 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 06:45:50 crc kubenswrapper[5118]: I0223 06:45:50.639416 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 04:49:04.590338658 +0000 UTC Feb 23 06:45:50 crc kubenswrapper[5118]: I0223 06:45:50.819276 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:45:50 crc kubenswrapper[5118]: I0223 06:45:50.821588 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:45:50 crc kubenswrapper[5118]: I0223 06:45:50.821645 
5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:45:50 crc kubenswrapper[5118]: I0223 06:45:50.821663 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:45:51 crc kubenswrapper[5118]: W0223 06:45:51.386976 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.387145 5118 trace.go:236] Trace[163050639]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 06:45:41.385) (total time: 10001ms): Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[163050639]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:45:51.386) Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[163050639]: [10.001935163s] [10.001935163s] END Feb 23 06:45:51 crc kubenswrapper[5118]: E0223 06:45:51.387185 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 06:45:51 crc kubenswrapper[5118]: W0223 06:45:51.530955 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.531088 5118 trace.go:236] Trace[352455934]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 06:45:41.529) (total time: 10001ms): Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[352455934]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:45:51.530) Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[352455934]: [10.001837581s] [10.001837581s] END Feb 23 06:45:51 crc kubenswrapper[5118]: E0223 06:45:51.531146 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.629970 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.640381 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 22:56:47.231710955 +0000 UTC Feb 23 06:45:51 crc kubenswrapper[5118]: W0223 06:45:51.686717 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.686818 5118 trace.go:236] Trace[1960156201]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 06:45:41.684) (total time: 10001ms): Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[1960156201]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:45:51.686)
Feb 23 06:45:51 crc kubenswrapper[5118]: Trace[1960156201]: [10.001835971s] [10.001835971s] END
Feb 23 06:45:51 crc kubenswrapper[5118]: E0223 06:45:51.686844 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.780620 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.780874 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.782362 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.782404 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:51 crc kubenswrapper[5118]: I0223 06:45:51.782416 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:52 crc kubenswrapper[5118]: E0223 06:45:52.594448 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Feb 23 06:45:52 crc kubenswrapper[5118]: E0223 06:45:52.607150 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 23 06:45:52 crc kubenswrapper[5118]: E0223 06:45:52.607469 5118 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:45:52 crc kubenswrapper[5118]: E0223 06:45:52.609770 5118 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1896cd368eeb0741 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:45:37.626564417 +0000 UTC m=+0.630349000,LastTimestamp:2026-02-23 06:45:37.626564417 +0000 UTC m=+0.630349000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 06:45:52 crc kubenswrapper[5118]: W0223 06:45:52.615180 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:52 crc kubenswrapper[5118]: E0223 06:45:52.615260 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.615549 5118 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.615653 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.621421 5118 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.621512 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.635204 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:52Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.641413 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:26:48.967866457 +0000 UTC
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.827657 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.829584 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2cb77c0ebf9d6d51b3d4533b013c6dcdfa0d2cebf4e22dd80c42b0f8568167a5" exitCode=255
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.829634 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2cb77c0ebf9d6d51b3d4533b013c6dcdfa0d2cebf4e22dd80c42b0f8568167a5"}
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.829812 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.830684 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.830719 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.830729 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:52 crc kubenswrapper[5118]: I0223 06:45:52.831199 5118 scope.go:117] "RemoveContainer" containerID="2cb77c0ebf9d6d51b3d4533b013c6dcdfa0d2cebf4e22dd80c42b0f8568167a5"
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.642514 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:44:33.235757705 +0000 UTC
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.660128 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.835116 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.837395 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"}
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.837561 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.838570 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.838617 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:53 crc kubenswrapper[5118]: I0223 06:45:53.838634 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.633821 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:54Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.643156 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 22:29:13.118156383 +0000 UTC
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.845408 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.846157 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.848594 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836" exitCode=255
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.848649 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"}
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.848718 5118 scope.go:117] "RemoveContainer" containerID="2cb77c0ebf9d6d51b3d4533b013c6dcdfa0d2cebf4e22dd80c42b0f8568167a5"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.848965 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.850357 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.850404 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.850420 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:54 crc kubenswrapper[5118]: I0223 06:45:54.851246 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"
Feb 23 06:45:54 crc kubenswrapper[5118]: E0223 06:45:54.851549 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.357331 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.634845 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:55Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.643998 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 11:49:22.318598389 +0000 UTC
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.768288 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.854941 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.857676 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.859294 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.859408 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.859438 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.860852 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"
Feb 23 06:45:55 crc kubenswrapper[5118]: E0223 06:45:55.861339 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.865158 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:55 crc kubenswrapper[5118]: I0223 06:45:55.943289 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.633436 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:56Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.644219 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 00:30:47.058113119 +0000 UTC
Feb 23 06:45:56 crc kubenswrapper[5118]: W0223 06:45:56.669208 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:56Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:56 crc kubenswrapper[5118]: E0223 06:45:56.669309 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.861211 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.862840 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.862891 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.862909 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:56 crc kubenswrapper[5118]: I0223 06:45:56.863695 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"
Feb 23 06:45:56 crc kubenswrapper[5118]: E0223 06:45:56.863957 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:45:57 crc kubenswrapper[5118]: W0223 06:45:57.402348 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:57Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:57 crc kubenswrapper[5118]: E0223 06:45:57.403045 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:45:57 crc kubenswrapper[5118]: W0223 06:45:57.514676 5118 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:57Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:57 crc kubenswrapper[5118]: E0223 06:45:57.514806 5118 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.632644 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:57Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.645023 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 05:21:33.254636119 +0000 UTC
Feb 23 06:45:57 crc kubenswrapper[5118]: E0223 06:45:57.808681 5118 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.864361 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.866183 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.866274 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.866298 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:57 crc kubenswrapper[5118]: I0223 06:45:57.867380 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836"
Feb 23 06:45:57 crc kubenswrapper[5118]: E0223 06:45:57.867715 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 23 06:45:58 crc kubenswrapper[5118]: I0223 06:45:58.635396 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:58Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:58 crc kubenswrapper[5118]: I0223 06:45:58.645729 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 15:15:25.590074967 +0000 UTC
Feb 23 06:45:59 crc kubenswrapper[5118]: E0223 06:45:59.000852 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:59Z is after 2026-02-23T05:33:13Z" interval="7s"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.008278 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.009869 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.009918 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.009931 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.009961 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 06:45:59 crc kubenswrapper[5118]: E0223 06:45:59.013347 5118 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:59Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.634422 5118 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:59Z is after 2026-02-23T05:33:13Z
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.646715 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:56:40.674969795 +0000 UTC
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.816692 5118 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.822058 5118 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 06:45:59 crc kubenswrapper[5118]: I0223 06:45:59.822195 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 06:46:00 crc kubenswrapper[5118]: I0223 06:46:00.647358 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 06:01:35.971535996 +0000 UTC
Feb 23 06:46:00 crc kubenswrapper[5118]: I0223 06:46:00.664172 5118 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 23 06:46:00 crc kubenswrapper[5118]: I0223 06:46:00.693009 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 06:46:00 crc kubenswrapper[5118]: I0223 06:46:00.713433 5118 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.628285 5118 apiserver.go:52] "Watching apiserver"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.635040 5118 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.635480 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.636147 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.636253 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.637122 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.637847 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.638923 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.639017 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.639661 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.640219 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.640502 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646300 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646343 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646406 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646422 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646510 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646662 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646337 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.646964 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.647044 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.647484 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 00:29:03.46800099 +0000 UTC
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.695458 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.717341 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.730780 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.732374 5118 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.747721 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.773602 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.801153 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.820203 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.821471 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.832518 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.832611 
5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.832654 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.832692 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.832727 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833370 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833382 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833483 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833564 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833633 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.834232 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835324 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835393 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835444 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833985 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.834012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835507 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.833859 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.834139 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835191 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835539 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835580 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835606 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835634 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:01 crc kubenswrapper[5118]: 
I0223 06:46:01.835632 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835660 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835689 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835757 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835787 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835812 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835840 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835869 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835897 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835924 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835953 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835979 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.835982 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836008 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836043 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836066 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836090 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836135 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836192 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836216 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836239 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836262 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836291 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " 
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836316 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836338 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836359 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836414 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836440 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836470 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836509 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836534 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836560 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836588 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 
06:46:01.836617 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836640 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836662 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836684 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836705 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836727 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836748 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836771 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836794 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836816 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836841 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 
06:46:01.836867 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836920 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836943 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836967 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836992 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837020 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837045 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837073 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837125 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837152 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837179 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837206 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837233 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837265 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837294 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837328 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837358 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837415 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837441 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837474 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837502 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837528 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" 
(UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837555 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837580 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837604 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837628 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837654 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837678 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837704 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837732 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837757 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837781 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837804 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837829 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837853 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837882 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837911 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc 
kubenswrapper[5118]: I0223 06:46:01.837964 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837992 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838024 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838050 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838076 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838118 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838142 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838167 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838190 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838216 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838241 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838263 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838287 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838314 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838344 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838369 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838396 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838424 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838451 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838477 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838504 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838531 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:01 crc 
kubenswrapper[5118]: I0223 06:46:01.838559 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838586 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838611 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838637 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838661 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838686 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838710 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838733 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838758 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838782 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838808 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838834 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838862 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838886 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838910 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838934 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838958 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838982 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839007 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839034 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839058 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839082 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839130 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839159 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839187 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839211 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839236 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839261 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839288 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839313 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839340 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839363 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839389 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839413 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839438 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839463 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839489 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839514 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839541 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839567 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839595 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839623 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839651 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839678 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839703 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839731 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839757 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839783 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839809 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839835 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 
06:46:01.839861 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839887 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839911 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839936 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839962 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839990 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840016 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840043 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840218 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840249 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840276 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " 
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840304 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840331 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840357 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840383 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840411 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840455 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840482 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840510 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840539 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840566 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840635 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840674 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840705 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840737 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840765 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840799 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840826 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840855 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840907 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840940 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840971 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841001 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841027 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841054 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841181 5118 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841200 5118 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841217 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841232 5118 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841247 5118 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841267 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841281 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841296 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841312 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841325 5118 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836093 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836233 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836367 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836470 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842267 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836522 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836781 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836854 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.836938 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837239 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837315 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837356 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837375 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837612 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837662 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837732 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837746 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837934 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837978 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837979 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.837978 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838002 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838187 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.838255 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839242 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839258 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839429 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839575 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.839803 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840010 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840347 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840359 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840615 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840758 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.840905 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841089 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841405 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.841723 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842021 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842183 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842358 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842415 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842684 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842692 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.842846 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.843511 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.843600 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.843804 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.844057 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.844443 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.845251 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.845283 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.846032 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.846600 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.846697 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.847047 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.847059 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.847155 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.847364 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.847842 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.848010 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.848331 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.848764 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.849605 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.850143 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.850656 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.851001 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.851505 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.851833 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852030 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852334 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852447 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852778 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852892 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.852973 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853258 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853822 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853858 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853861 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853904 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853932 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.853950 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.854446 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.854857 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.854940 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.855029 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.855380 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.855789 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.855953 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.856041 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:02.356013822 +0000 UTC m=+25.359798405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.856445 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.856706 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.856726 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.856774 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:02.356763249 +0000 UTC m=+25.360547832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.857397 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.857500 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.857630 5118 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.857716 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.858030 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.858448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.858658 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.858801 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.858860 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859420 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859345 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859704 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859766 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859955 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.859954 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.860649 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.860705 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.861183 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.861261 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.861648 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.861729 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.861809 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.862735 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.862773 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.862857 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.863043 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.864434 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.864587 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865055 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865563 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865774 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865807 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865809 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865889 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.865953 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.866432 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.866530 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.866599 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.866913 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.866932 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.867617 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.867607 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.867618 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868004 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868168 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868356 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868393 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868452 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868711 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.868843 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869177 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869306 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869348 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869523 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869563 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869828 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.869884 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870112 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870459 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870542 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870649 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870666 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870695 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.870765 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.871070 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.871532 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:02.371511213 +0000 UTC m=+25.375295796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.871531 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.871895 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.872441 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.872657 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.872667 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.873018 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.873215 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.873436 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.873699 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.873918 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.874091 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.874950 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.875482 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.876783 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.876850 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.876880 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.876984 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:02.376953875 +0000 UTC m=+25.380738488 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.879717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.880395 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.883652 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.883692 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.883713 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.883784 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:02.383760758 +0000 UTC m=+25.387545541 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.885563 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.885582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.885698 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.886024 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.886011 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.886505 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.887896 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.888284 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.888310 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.888498 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.888909 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.889039 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.889628 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.889892 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.889926 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.889999 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.890811 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.890839 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.891711 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.891924 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.892454 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.892615 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.892762 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.900896 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.905552 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.908393 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.915925 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.918817 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.922591 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.931455 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.941959 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942297 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942432 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942456 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942534 5118 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942559 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on 
node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942354 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942581 5118 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942678 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942698 5118 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942715 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942730 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942745 5118 reconciler_common.go:293] "Volume detached for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942758 5118 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942771 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942787 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942802 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942817 5118 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942831 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942845 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath 
\"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942859 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942873 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942886 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942898 5118 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942911 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942924 5118 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942937 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 
06:46:01.942949 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942962 5118 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942976 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.942989 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943002 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943015 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943028 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943041 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943056 5118 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943068 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943082 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943129 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943142 5118 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943154 5118 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943167 5118 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943180 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943193 5118 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943206 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943218 5118 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943231 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943244 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943256 5118 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc 
kubenswrapper[5118]: I0223 06:46:01.943267 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943282 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943294 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943307 5118 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943318 5118 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943333 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943350 5118 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943367 5118 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943384 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943398 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943410 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943424 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943436 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943448 5118 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943461 5118 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943474 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943490 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943508 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943526 5118 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943545 5118 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943560 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943573 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943585 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943600 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943618 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943638 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943656 5118 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943670 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943683 5118 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node 
\"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943696 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943662 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943708 5118 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943843 5118 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943858 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943873 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943885 5118 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943899 5118 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943911 5118 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943925 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943937 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943950 5118 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943962 5118 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943975 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943987 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.943998 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944011 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944024 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944036 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944047 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944060 5118 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944074 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944087 5118 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944127 5118 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944140 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944153 5118 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944166 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944178 5118 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944190 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944201 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944214 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944226 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944237 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944249 5118 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944260 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: 
I0223 06:46:01.944272 5118 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944283 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944295 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944306 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944317 5118 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944329 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944341 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944352 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944363 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944375 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944386 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944399 5118 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944411 5118 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944422 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944441 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944453 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944464 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944476 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944490 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944503 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944515 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944526 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944537 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944550 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944561 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944574 5118 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944587 5118 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944623 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944635 5118 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: 
I0223 06:46:01.944645 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944658 5118 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944703 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944717 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944730 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944742 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944754 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944767 5118 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944779 5118 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944790 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944802 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944821 5118 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944845 5118 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944857 5118 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944869 5118 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944883 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944896 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944908 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944919 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944931 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944944 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944957 5118 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944979 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.944993 5118 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945005 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945018 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945030 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945042 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945055 5118 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945067 
5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945078 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945119 5118 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945133 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945146 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945159 5118 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945171 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945184 5118 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945197 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945210 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945221 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945246 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945260 5118 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945272 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945285 5118 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945298 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945310 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.945321 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.955721 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.962574 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.976933 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.978240 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-ce
rts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\
\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: W0223 06:46:01.978520 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-f20a8a0ebfeb2b14d0bbc5c0eb975b98e59119a383739eb31cfea0c1556c4526 WatchSource:0}: Error finding container f20a8a0ebfeb2b14d0bbc5c0eb975b98e59119a383739eb31cfea0c1556c4526: Status 404 returned error can't find the container with id f20a8a0ebfeb2b14d0bbc5c0eb975b98e59119a383739eb31cfea0c1556c4526 Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.983130 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:01 crc kubenswrapper[5118]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 23 06:46:01 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:01 crc kubenswrapper[5118]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 23 06:46:01 crc kubenswrapper[5118]: source /etc/kubernetes/apiserver-url.env Feb 23 06:46:01 crc kubenswrapper[5118]: else Feb 23 
06:46:01 crc kubenswrapper[5118]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 23 06:46:01 crc kubenswrapper[5118]: exit 1 Feb 23 06:46:01 crc kubenswrapper[5118]: fi Feb 23 06:46:01 crc kubenswrapper[5118]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 23 06:46:01 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256
,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:01 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.984576 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 23 06:46:01 crc kubenswrapper[5118]: W0223 06:46:01.990253 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6fd9878f5e500050b2d5fced456d00a3187da71596f15a5097d916dd9fad3c41 WatchSource:0}: Error finding container 6fd9878f5e500050b2d5fced456d00a3187da71596f15a5097d916dd9fad3c41: Status 404 returned error can't find the container with id 6fd9878f5e500050b2d5fced456d00a3187da71596f15a5097d916dd9fad3c41 Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.991064 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:01 crc kubenswrapper[5118]: I0223 06:46:01.992841 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.993132 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:01 crc kubenswrapper[5118]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:46:01 crc kubenswrapper[5118]: if [[ -f "/env/_master" ]]; then Feb 23 06:46:01 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:01 crc kubenswrapper[5118]: source "/env/_master" Feb 23 06:46:01 crc kubenswrapper[5118]: set +o allexport Feb 23 06:46:01 crc kubenswrapper[5118]: fi Feb 23 06:46:01 crc kubenswrapper[5118]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Feb 23 06:46:01 crc kubenswrapper[5118]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 23 06:46:01 crc kubenswrapper[5118]: ho_enable="--enable-hybrid-overlay" Feb 23 06:46:01 crc kubenswrapper[5118]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 23 06:46:01 crc kubenswrapper[5118]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 23 06:46:01 crc kubenswrapper[5118]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 23 06:46:01 crc kubenswrapper[5118]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:46:01 crc kubenswrapper[5118]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 23 06:46:01 crc kubenswrapper[5118]: --webhook-host=127.0.0.1 \ Feb 23 06:46:01 crc kubenswrapper[5118]: --webhook-port=9743 \ Feb 23 06:46:01 crc kubenswrapper[5118]: ${ho_enable} \ Feb 23 06:46:01 crc kubenswrapper[5118]: --enable-interconnect \ Feb 23 06:46:01 crc kubenswrapper[5118]: --disable-approver \ Feb 23 
06:46:01 crc kubenswrapper[5118]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 23 06:46:01 crc kubenswrapper[5118]: --wait-for-kubernetes-api=200s \ Feb 23 06:46:01 crc kubenswrapper[5118]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 23 06:46:01 crc kubenswrapper[5118]: --loglevel="${LOGLEVEL}" Feb 23 06:46:01 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:01 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 06:46:01 crc kubenswrapper[5118]: E0223 06:46:01.998850 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:01 crc kubenswrapper[5118]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:46:01 crc kubenswrapper[5118]: if [[ -f "/env/_master" ]]; then Feb 23 06:46:01 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:01 crc kubenswrapper[5118]: source "/env/_master" Feb 23 06:46:01 crc kubenswrapper[5118]: set +o allexport Feb 23 06:46:01 crc kubenswrapper[5118]: fi Feb 23 06:46:01 crc kubenswrapper[5118]: Feb 23 06:46:01 crc kubenswrapper[5118]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 23 06:46:01 crc kubenswrapper[5118]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:46:01 crc kubenswrapper[5118]: --disable-webhook \ Feb 23 06:46:01 crc kubenswrapper[5118]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 23 06:46:01 crc kubenswrapper[5118]: --loglevel="${LOGLEVEL}" Feb 23 06:46:01 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:01 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.000020 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.004880 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: W0223 06:46:02.007146 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-3effdac3cdbb645aeacdd9593995323b1813de26e71ea22fe733cff076df286b WatchSource:0}: Error finding container 3effdac3cdbb645aeacdd9593995323b1813de26e71ea22fe733cff076df286b: Status 404 returned error can't find the container with id 3effdac3cdbb645aeacdd9593995323b1813de26e71ea22fe733cff076df286b Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.010629 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.012007 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.017463 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.034157 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.450707 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.450852 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.450892 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.450926 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451042 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:03.450996995 +0000 UTC m=+26.454781608 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451080 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451215 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:03.45118674 +0000 UTC m=+26.454971323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.451206 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451369 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451391 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451444 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451472 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451506 5118 configmap.go:193] Couldn't get 
configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451395 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451650 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451546 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:03.451529997 +0000 UTC m=+26.455314600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451835 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:03.451735992 +0000 UTC m=+26.455520635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.451898 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:03.451880855 +0000 UTC m=+26.455665468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.648504 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 09:25:57.576345657 +0000 UTC Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.881433 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fd9878f5e500050b2d5fced456d00a3187da71596f15a5097d916dd9fad3c41"} Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.882840 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f20a8a0ebfeb2b14d0bbc5c0eb975b98e59119a383739eb31cfea0c1556c4526"} Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.884344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3effdac3cdbb645aeacdd9593995323b1813de26e71ea22fe733cff076df286b"} Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.884663 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:02 crc kubenswrapper[5118]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:46:02 crc kubenswrapper[5118]: if [[ -f "/env/_master" ]]; then Feb 23 06:46:02 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:02 crc kubenswrapper[5118]: source "/env/_master" Feb 23 06:46:02 crc kubenswrapper[5118]: set +o allexport Feb 23 06:46:02 crc kubenswrapper[5118]: fi Feb 23 06:46:02 crc kubenswrapper[5118]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Feb 23 06:46:02 crc kubenswrapper[5118]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Feb 23 06:46:02 crc kubenswrapper[5118]: ho_enable="--enable-hybrid-overlay" Feb 23 06:46:02 crc kubenswrapper[5118]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Feb 23 06:46:02 crc kubenswrapper[5118]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Feb 23 06:46:02 crc kubenswrapper[5118]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Feb 23 06:46:02 crc kubenswrapper[5118]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:46:02 crc kubenswrapper[5118]: --webhook-cert-dir="/etc/webhook-cert" \ Feb 23 06:46:02 crc kubenswrapper[5118]: --webhook-host=127.0.0.1 \ Feb 23 06:46:02 crc kubenswrapper[5118]: --webhook-port=9743 \ Feb 23 06:46:02 crc kubenswrapper[5118]: ${ho_enable} \ Feb 23 06:46:02 crc kubenswrapper[5118]: --enable-interconnect \ Feb 23 06:46:02 crc kubenswrapper[5118]: --disable-approver \ Feb 23 06:46:02 crc kubenswrapper[5118]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Feb 23 06:46:02 crc kubenswrapper[5118]: --wait-for-kubernetes-api=200s \ Feb 23 06:46:02 crc kubenswrapper[5118]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Feb 23 06:46:02 crc kubenswrapper[5118]: --loglevel="${LOGLEVEL}" Feb 23 06:46:02 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:02 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.885321 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:02 crc kubenswrapper[5118]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Feb 23 
06:46:02 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:02 crc kubenswrapper[5118]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Feb 23 06:46:02 crc kubenswrapper[5118]: source /etc/kubernetes/apiserver-url.env Feb 23 06:46:02 crc kubenswrapper[5118]: else Feb 23 06:46:02 crc kubenswrapper[5118]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Feb 23 06:46:02 crc kubenswrapper[5118]: exit 1 Feb 23 06:46:02 crc kubenswrapper[5118]: fi Feb 23 06:46:02 crc kubenswrapper[5118]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Feb 23 06:46:02 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:02 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 
06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.886839 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.887273 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.887682 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 06:46:02 crc kubenswrapper[5118]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Feb 23 06:46:02 crc kubenswrapper[5118]: if [[ -f "/env/_master" ]]; then Feb 23 06:46:02 crc kubenswrapper[5118]: set -o allexport Feb 23 06:46:02 crc kubenswrapper[5118]: source "/env/_master" Feb 23 06:46:02 crc kubenswrapper[5118]: set +o allexport Feb 23 06:46:02 crc 
kubenswrapper[5118]: fi Feb 23 06:46:02 crc kubenswrapper[5118]: Feb 23 06:46:02 crc kubenswrapper[5118]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Feb 23 06:46:02 crc kubenswrapper[5118]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Feb 23 06:46:02 crc kubenswrapper[5118]: --disable-webhook \ Feb 23 06:46:02 crc kubenswrapper[5118]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Feb 23 06:46:02 crc kubenswrapper[5118]: --loglevel="${LOGLEVEL}" Feb 23 06:46:02 crc kubenswrapper[5118]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Conta
inerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Feb 23 06:46:02 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.888452 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Feb 23 06:46:02 crc kubenswrapper[5118]: E0223 06:46:02.888944 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.901291 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.915460 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.938304 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.955032 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.969429 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.977964 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:02 crc kubenswrapper[5118]: I0223 06:46:02.987444 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.003917 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.020699 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.032901 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.044031 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.061261 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.077261 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.092906 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.460176 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.460275 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:03 crc 
kubenswrapper[5118]: I0223 06:46:03.460318 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.460357 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460436 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:05.460388017 +0000 UTC m=+28.464172630 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460493 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.460519 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460579 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:05.46055019 +0000 UTC m=+28.464334803 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460579 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460624 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460644 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460712 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:05.460694334 +0000 UTC m=+28.464478947 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460713 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460730 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460782 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460792 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:05.460771536 +0000 UTC m=+28.464556219 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460802 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.460904 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:05.460876788 +0000 UTC m=+28.464661401 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.649499 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:08:03.112546567 +0000 UTC Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.697311 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.697401 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.697341 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.697549 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.697716 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:03 crc kubenswrapper[5118]: E0223 06:46:03.697967 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.706077 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.707954 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.710736 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.712553 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.714838 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.716341 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.717852 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.720052 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.721397 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.722704 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.723866 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.725330 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.726503 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.727614 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.730811 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.731913 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.733927 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.734743 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.735899 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.737856 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.738837 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.739996 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.740896 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.742274 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.743238 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.744489 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.745853 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.746833 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.748987 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.750180 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.751137 5118 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.751345 5118 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.754471 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.755848 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.757804 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.761815 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.763278 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.765207 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.766714 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.769131 5118 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.770423 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.772819 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.774397 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.777220 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.778521 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.780870 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.782242 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.784555 5118 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.785859 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.788148 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.789617 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.791645 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.792842 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 23 06:46:03 crc kubenswrapper[5118]: I0223 06:46:03.793842 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 23 06:46:04 crc kubenswrapper[5118]: I0223 06:46:04.649713 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:29:06.814797419 +0000 UTC Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.480657 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.480763 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.480806 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.480832 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:09.480799097 +0000 UTC m=+32.484583670 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.480891 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.480956 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.480900 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481036 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481148 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481168 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481058 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481244 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481258 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.480997 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481089 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:09.481061434 +0000 UTC m=+32.484846037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481361 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:09.48134039 +0000 UTC m=+32.485125003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481392 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:09.481377851 +0000 UTC m=+32.485162464 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.481420 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:09.481407751 +0000 UTC m=+32.485192354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.650303 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:23:38.009681854 +0000 UTC Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.697306 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.697408 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:05 crc kubenswrapper[5118]: I0223 06:46:05.697353 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.697555 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.697676 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:05 crc kubenswrapper[5118]: E0223 06:46:05.697909 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.013755 5118 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.016506 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.016582 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.016609 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.016746 5118 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.026787 5118 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.027059 5118 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.028715 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.028775 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.028865 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.028901 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.028931 5118 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.052300 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.058167 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.058235 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.058258 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.058288 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.058312 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.084327 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.091573 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.091639 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.091653 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.091676 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.091692 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.104660 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.110167 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.110232 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.110256 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.110290 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.110318 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.129908 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.136242 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.136321 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.136351 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.136384 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.136413 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.153905 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:06 crc kubenswrapper[5118]: E0223 06:46:06.154160 5118 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.157200 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.157275 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.157289 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.157314 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.157328 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.261258 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.261349 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.261368 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.261395 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.261417 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.364862 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.364934 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.364952 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.364977 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.364998 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.468367 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.468440 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.468457 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.468488 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.468507 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.571328 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.571412 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.571435 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.571462 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.571481 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.650613 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 22:47:30.686197452 +0000 UTC Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.674574 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.674641 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.674654 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.674683 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.674697 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.778147 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.778215 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.778233 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.778258 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.778276 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.880388 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.880431 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.880443 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.880460 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.880472 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.984257 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.984326 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.984343 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.984369 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:06 crc kubenswrapper[5118]: I0223 06:46:06.984386 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:06Z","lastTransitionTime":"2026-02-23T06:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.067359 5118 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.088385 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.088456 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.088475 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.088504 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.088526 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.192331 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.192411 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.192435 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.192469 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.192494 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.296498 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.296568 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.296587 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.296616 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.296656 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.354864 5118 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.400429 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.400487 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.400500 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.400518 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.400530 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.503693 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.503773 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.503796 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.503827 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.503851 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.606734 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.606789 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.606807 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.606839 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.606856 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.651068 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 17:18:09.173082165 +0000 UTC
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.696953 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.697052 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.696975 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:07 crc kubenswrapper[5118]: E0223 06:46:07.697194 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:07 crc kubenswrapper[5118]: E0223 06:46:07.697464 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:07 crc kubenswrapper[5118]: E0223 06:46:07.697697 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.709827 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.709904 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.709929 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.709962 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.709987 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.713209 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.729697 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.759586 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.778850 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.790876 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.805943 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.812286 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.812418 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.812459 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.812499 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.812524 5118 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.818852 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.915071 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.915166 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.915185 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.915214 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:07 crc kubenswrapper[5118]: I0223 06:46:07.915232 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:07Z","lastTransitionTime":"2026-02-23T06:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.018362 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.018439 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.018465 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.018498 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.018521 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.121351 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.121419 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.121440 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.121467 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.121486 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.225193 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.225280 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.225303 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.225331 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.225349 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.328628 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.328698 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.328717 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.328745 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.328764 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.432876 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.432955 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.432979 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.433007 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.433029 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.536665 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.536728 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.536742 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.536766 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.536782 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.640260 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.640304 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.640318 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.640332 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.640342 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.651679 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:05:54.458391811 +0000 UTC Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.708808 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.709406 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.743995 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.744071 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.744160 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.744212 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.744243 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.847853 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.847918 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.847935 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.847963 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.847982 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.951153 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.951261 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.951284 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.951317 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:08 crc kubenswrapper[5118]: I0223 06:46:08.951337 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:08Z","lastTransitionTime":"2026-02-23T06:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.053804 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.053878 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.053907 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.053949 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.053982 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.164244 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.164306 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.164321 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.164347 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.164363 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.187463 5118 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.268081 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.268195 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.268214 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.268241 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.268261 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.371935 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.371992 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.372010 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.372039 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.372063 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.475175 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.475230 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.475240 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.475281 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.475295 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.523810 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.523902 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.523983 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.52394695 +0000 UTC m=+40.527731553 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524026 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.524057 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524072 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.524063563 +0000 UTC m=+40.527848136 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.524169 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.524208 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524349 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524373 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524393 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524395 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524415 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524447 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.524426871 +0000 UTC m=+40.528211474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524445 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524612 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.524568264 +0000 UTC m=+40.528352867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524616 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.524749 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:17.524715218 +0000 UTC m=+40.528499941 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.578317 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.578420 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.578440 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.578468 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.578524 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.652185 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:31:37.801062426 +0000 UTC Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.681736 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.681807 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.681826 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.681855 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.681876 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.697373 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.697444 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.697504 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.697444 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.697732 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:09 crc kubenswrapper[5118]: E0223 06:46:09.697924 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.787057 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.787168 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.787190 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.787223 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.787251 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.822312 5118 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.822432 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.822523 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.840975 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.841061 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.841462 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75" gracePeriod=30 Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.890796 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.890848 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.890860 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.890881 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.890894 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.909369 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.912520 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.912956 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.962716 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o:
//408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.983078 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.993804 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.993856 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.993873 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.993890 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.993902 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:09Z","lastTransitionTime":"2026-02-23T06:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:09 crc kubenswrapper[5118]: I0223 06:46:09.995700 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.013204 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1845c79-b386-4abe-a0c6-dff68eafa20f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:53Z\\\",\\\"message\\\":\\\"W0223 06:45:53.094151 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:53.094751 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829153 cert, and key in /tmp/serving-cert-3990055708/serving-signer.crt, 
/tmp/serving-cert-3990055708/serving-signer.key\\\\nI0223 06:45:53.463180 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:53.469521 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:53.469767 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:53.470624 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3990055708/tls.crt::/tmp/serving-cert-3990055708/tls.key\\\\\\\"\\\\nF0223 06:45:53.870886 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.030153 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6d992b-1e55-498b-b8bc-7479a5277c98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e61f007646da29a9c70c05c5e
11525eda1c87e3e1ed73002f6d004916d6af9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d222cb9a87bbfbcb33e86f0057d23be3c8706114b164516d5e9b63e0af94a65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.050036 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.066060 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.078998 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.090634 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.096874 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.096914 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.096928 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.096947 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.096961 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.200268 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.200304 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.200316 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.200334 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.200346 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.303501 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.303567 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.303589 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.303613 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.303628 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.406474 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.406563 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.406576 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.406600 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.406613 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.510626 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.510717 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.510746 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.510778 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.510802 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.614154 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.614227 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.614251 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.614280 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.614303 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.652730 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 07:06:13.824741388 +0000 UTC Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.718033 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.718128 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.718161 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.718189 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.718208 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.822241 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.822304 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.822321 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.822346 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.822364 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.920293 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.920767 5118 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75" exitCode=255 Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.920842 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.920916 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5f7124df0e9b44b5c7b5fcd6d06cb57117b059534e7a13fe7ab7f8a4a294a185"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.925961 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.926077 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.926135 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.926168 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.926205 5118 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:10Z","lastTransitionTime":"2026-02-23T06:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.938564 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.955959 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.981480 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:10 crc kubenswrapper[5118]: I0223 06:46:10.998328 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6d992b-1e55-498b-b8bc-7479a5277c98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f7124df0e9b44b5c7b5fcd6d06cb57117b059534e7a13fe7ab7f8a4a294a185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:46:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:45:39.747656 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:39.750925 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:39.788444 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:39.793188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:46:09.847344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0223 06:46:09.847455 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e61f007646da29a9c70c05c5e11525eda1c87e3e1ed73002f6d004916d6af9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d222cb9a87bbfbcb33e86f0057d23be3c8706114b164516d5e9b63e0af94a65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.014924 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.027682 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.029366 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.029410 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.029422 5118 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.029440 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.029455 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.043006 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.056998 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.073120 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1845c79-b386-4abe-a0c6-dff68eafa20f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:53Z\\\",\\\"message\\\":\\\"W0223 06:45:53.094151 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:53.094751 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829153 cert, and key in /tmp/serving-cert-3990055708/serving-signer.crt, /tmp/serving-cert-3990055708/serving-signer.key\\\\nI0223 06:45:53.463180 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:53.469521 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:53.469767 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:53.470624 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3990055708/tls.crt::/tmp/serving-cert-3990055708/tls.key\\\\\\\"\\\\nF0223 06:45:53.870886 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.132303 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.132379 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.132397 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.132430 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.132449 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.234972 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.235058 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.235075 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.235151 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.235179 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.338578 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.338629 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.338641 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.338659 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.338671 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.441651 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.441713 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.441730 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.441754 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.441771 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.544566 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.544626 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.544641 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.544659 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.544670 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.648890 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.648956 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.648978 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.649009 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.649040 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.653134 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 13:56:26.498188417 +0000 UTC
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.696709 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.696792 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:11 crc kubenswrapper[5118]: E0223 06:46:11.696915 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.696939 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:11 crc kubenswrapper[5118]: E0223 06:46:11.697172 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:11 crc kubenswrapper[5118]: E0223 06:46:11.697396 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.751702 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.751772 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.751789 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.751819 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.751839 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.855527 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.855602 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.855621 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.855643 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.855660 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.959329 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.959394 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.959412 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.959450 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:11 crc kubenswrapper[5118]: I0223 06:46:11.959470 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:11Z","lastTransitionTime":"2026-02-23T06:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.062427 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.062505 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.062531 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.062557 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.062575 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.165349 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.165402 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.165416 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.165438 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.165455 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.269145 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.269193 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.269204 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.269221 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.269234 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.375431 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.375484 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.375500 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.375518 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.375536 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.479490 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.479558 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.479574 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.479596 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.479608 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.583186 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.583255 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.583276 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.583303 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.583323 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.654052 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 23:54:51.621774624 +0000 UTC
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.686448 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.686504 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.686514 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.686534 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.686548 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.790426 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.790505 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.790522 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.790547 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.790565 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.874809 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.893196 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.893271 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.893288 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.893312 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.893330 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.995786 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.995844 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.995860 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.995887 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:12 crc kubenswrapper[5118]: I0223 06:46:12.995904 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:12Z","lastTransitionTime":"2026-02-23T06:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.099432 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.099500 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.099517 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.099542 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.099562 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.202287 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.202351 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.202369 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.202393 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.202412 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.305670 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.305737 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.305756 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.305781 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.305800 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.408816 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.408875 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.408891 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.408915 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.408932 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.511829 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.511904 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.511926 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.511959 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.511980 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.614670 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.614718 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.614740 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.614768 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.614785 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.654872 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 01:34:44.596393275 +0000 UTC
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.696786 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.696805 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.696937 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:13 crc kubenswrapper[5118]: E0223 06:46:13.697187 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:13 crc kubenswrapper[5118]: E0223 06:46:13.697391 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:13 crc kubenswrapper[5118]: E0223 06:46:13.697758 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.718838 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.718932 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.718950 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.718978 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.718996 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.821942 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.822026 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.822049 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.822078 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.822145 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.924810 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.924891 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.924912 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.924941 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:13 crc kubenswrapper[5118]: I0223 06:46:13.924966 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:13Z","lastTransitionTime":"2026-02-23T06:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.028159 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.028200 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.028209 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.028227 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.028237 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.131613 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.131690 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.131705 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.131730 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.131747 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.235110 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.235185 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.235199 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.235221 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.235235 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.338206 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.338277 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.338295 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.338322 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.338338 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.441835 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.441914 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.441939 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.441970 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.441994 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.545458 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.545536 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.545569 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.545600 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.545622 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.648991 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.649064 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.649091 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.649151 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.649168 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.655308 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 06:06:28.103320263 +0000 UTC Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.752230 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.752316 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.752344 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.752397 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.752422 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.861400 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.861465 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.861484 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.861510 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.861528 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.964578 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.964667 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.964698 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.964742 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:14 crc kubenswrapper[5118]: I0223 06:46:14.964772 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:14Z","lastTransitionTime":"2026-02-23T06:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.069439 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.069562 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.069580 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.069604 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.069621 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.172684 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.172747 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.172759 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.172781 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.172796 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.275754 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.276337 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.276354 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.276381 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.276395 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.379499 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.379556 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.379570 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.379593 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.379610 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.484408 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.484478 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.484522 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.484549 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.484566 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.588151 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.588229 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.588247 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.588276 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.588296 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.656032 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:23:50.895270524 +0000 UTC Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.692274 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.692346 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.692357 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.692376 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.692390 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.696627 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.696697 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.696749 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:15 crc kubenswrapper[5118]: E0223 06:46:15.696813 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:15 crc kubenswrapper[5118]: E0223 06:46:15.696897 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:15 crc kubenswrapper[5118]: E0223 06:46:15.696967 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.795189 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.795270 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.795286 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.795307 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.795346 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.898205 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.898257 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.898266 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.898285 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.898297 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:15Z","lastTransitionTime":"2026-02-23T06:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.939325 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a04020cdeb0b3a20e89fcfc4003c5a19c2429d0a33795485b1c99959c136d7d4"} Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.954136 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1845c79-b386-4abe-a0c6-dff68eafa20f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:53Z\\\",\\\"message\\\":\\\"W0223 06:45:53.094151 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:53.094751 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829153 cert, and key in /tmp/serving-cert-3990055708/serving-signer.crt, /tmp/serving-cert-3990055708/serving-signer.key\\\\nI0223 06:45:53.463180 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:53.469521 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:53.469767 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:53.470624 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3990055708/tls.crt::/tmp/serving-cert-3990055708/tls.key\\\\\\\"\\\\nF0223 06:45:53.870886 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.971073 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6d992b-1e55-498b-b8bc-7479a5277c98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f7124df0e9b44b5c7b5fcd6d06cb57117b059534e7a13fe7ab7f8a4a294a185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:46:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:45:39.747656 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:39.750925 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:39.788444 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:39.793188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:46:09.847344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0223 06:46:09.847455 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e61f007646da29a9c70c05c5e11525eda1c87e3e1ed73002f6d004916d6af9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d222cb9a87bbfbcb33e86f0057d23be3c8706114b164516d5e9b63e0af94a65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.983400 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a04020cdeb0b3a20e89fcfc4003c5a19c2429d0a33795485b1c99959c136d7d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:15 crc kubenswrapper[5118]: I0223 06:46:15.999616 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.001252 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.001333 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.001353 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.001385 5118 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.001408 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.015493 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.035302 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.067391 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.081260 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.096700 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.104426 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.104485 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.104500 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.104524 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.104536 5118 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.207556 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.207805 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.207864 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.207933 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.207991 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.311366 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.311441 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.311462 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.311495 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.311521 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.357840 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.357887 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.357899 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.357919 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.357930 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.372834 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.378956 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.378999 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.379013 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.379034 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.379048 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.390576 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.395426 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.395649 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.395801 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.395963 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.396126 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.415701 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.421119 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.421171 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.421183 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.421213 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.421225 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.435058 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.439920 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.439977 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.439992 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.440016 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.440029 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.454243 5118 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e5e564e2-37a5-4e17-8e7d-53999163ef5a\\\",\\\"systemUUID\\\":\\\"9f1192b7-57d9-42cb-906c-9c985ef0a7ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: E0223 06:46:16.454420 5118 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.456696 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.456738 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.456750 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.456771 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.456794 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.559792 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.559861 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.559871 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.559892 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.559907 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.657013 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 00:58:46.428943472 +0000 UTC Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.662326 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.662392 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.662408 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.662429 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.662442 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.765996 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.766069 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.766135 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.766184 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.766211 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.821598 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.830021 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.844743 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1845c79-b386-4abe-a0c6-dff68eafa20f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:45:53Z\\\",\\\"message\\\":\\\"W0223 06:45:53.094151 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0223 06:45:53.094751 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771829153 cert, and key in /tmp/serving-cert-3990055708/serving-signer.crt, /tmp/serving-cert-3990055708/serving-signer.key\\\\nI0223 06:45:53.463180 1 observer_polling.go:159] Starting file observer\\\\nW0223 06:45:53.469521 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 2026-02-23T05:33:16Z\\\\nI0223 06:45:53.469767 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 06:45:53.470624 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3990055708/tls.crt::/tmp/serving-cert-3990055708/tls.key\\\\\\\"\\\\nF0223 06:45:53.870886 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T06:45:53Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.857886 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f6d992b-1e55-498b-b8bc-7479a5277c98\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f7124df0e9b44b5c7b5fcd6d06cb57117b059534e7a13fe7ab7f8a4a294a185\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://652ce8251f54f91e404de9c077c6fffb8161bfab7a27a04f5a396856397e5a75\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T06:46:09Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0223 06:45:39.747656 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0223 06:45:39.750925 1 observer_polling.go:159] Starting file observer\\\\nI0223 06:45:39.788444 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0223 06:45:39.793188 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0223 06:46:09.847344 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0223 06:46:09.847455 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e61f007646da29a9c70c05c5e11525eda1c87e3e1ed73002f6d004916d6af9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d222cb9a87bbfbcb33e86f0057d23be3c8706114b164516d5e9b63e0af94a65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.868994 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.869055 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.869072 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.869126 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.869150 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.875707 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a04020cdeb0b3a20e89fcfc4003c5a19c2429d0a33795485b1c99959c136d7d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.888454 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.905169 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.918840 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.937535 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"228d1c0a-4133-43da-a328-063f42677662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T06:45:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f2434820425ac9615407a8ba244560ae5d4c75881f7bf7bdcb405319ff0e6b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://427a203a21051e361af4f4ca299111aa5e2279337557cf7d83e05216558ad594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d97834a43bfdaecc40978ad4e68df0062e68482eeb89f39ce769c2757d05defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://218f793d696acfac371a8415994ff3c3264392d23dad9615b9cef384c24b25d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://366161fd64e0a87e511b9216b8b1a36d0aa44ad8d46b53c339222c99d93aff2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T06:45:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a2e6d14530caa423405afe950f227909c89009e218bc5deab3b1fd4b95e2bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T06:45:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f46c850303a1f37ac40252ec2b02d7ea300de1476a537e83bb59a2bb38061d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:39Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408376c6ff3b97ba64da3056a8aae3a2f37880104e6f1b5e08128fabd4bf06f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T06:45:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T06:45:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T06:45:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.952594 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"efdb6088b0d1436044f20b95d7a6401c34c9c24e5784f18b28b11e933414093b"} Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.954320 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.964790 5118 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T06:46:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.971561 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.971594 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.971607 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.971626 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:16 crc kubenswrapper[5118]: I0223 06:46:16.971639 5118 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:16Z","lastTransitionTime":"2026-02-23T06:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.074354 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.074448 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.074475 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.074510 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.074545 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.177297 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.177365 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.177378 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.177402 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.177414 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.281428 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.281483 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.281500 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.281520 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.281532 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.384709 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.384770 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.384787 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.384812 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.384829 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.489412 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.489815 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.489827 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.489846 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.489860 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.592644 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.592699 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.592708 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.592729 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.592742 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.607196 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.607302 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.607337 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.607360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.607387 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607501 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.607449451 +0000 UTC m=+56.611234034 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607534 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607560 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607583 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607603 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607591 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607634 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.607605216 +0000 UTC m=+56.611389969 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607661 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.607652237 +0000 UTC m=+56.611437060 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607743 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607757 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607767 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607781 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.607725899 +0000 UTC m=+56.611510622 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.607828 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.60780724 +0000 UTC m=+56.611592043 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.658042 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 06:25:39.164944935 +0000 UTC Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.695440 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.695501 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.695514 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.695534 5118 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.695546 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.696741 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.696776 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.696741 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.696883 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.697018 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:17 crc kubenswrapper[5118]: E0223 06:46:17.697140 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.801820 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=16.801797832 podStartE2EDuration="16.801797832s" podCreationTimestamp="2026-02-23 06:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:17.801374832 +0000 UTC m=+40.805159445" watchObservedRunningTime="2026-02-23 06:46:17.801797832 +0000 UTC m=+40.805582405" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.809129 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.809195 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc 
kubenswrapper[5118]: I0223 06:46:17.809218 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.809250 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.809276 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.902695 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.902661827 podStartE2EDuration="9.902661827s" podCreationTimestamp="2026-02-23 06:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:17.902264058 +0000 UTC m=+40.906048631" watchObservedRunningTime="2026-02-23 06:46:17.902661827 +0000 UTC m=+40.906446410" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.912600 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.912648 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.912663 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.912682 5118 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.912699 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:17Z","lastTransitionTime":"2026-02-23T06:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.959093 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0f1df4f1a1325886d0018cc2bf0a75b9336444045f4e5765bb3af02b4030df83"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.961379 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2cf023995799cd4b154a4e10247f0b7c6eb08e01002b073734a2656e743ac539"} Feb 23 06:46:17 crc kubenswrapper[5118]: I0223 06:46:17.973944 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=8.973928977 podStartE2EDuration="8.973928977s" podCreationTimestamp="2026-02-23 06:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:17.920001649 +0000 UTC m=+40.923786242" watchObservedRunningTime="2026-02-23 06:46:17.973928977 +0000 UTC m=+40.977713560" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.016039 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.016115 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.016131 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.016149 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.016162 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.119568 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.119637 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.119652 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.119672 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.119691 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.223161 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.223240 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.223259 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.223301 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.223323 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.326458 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.326529 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.326541 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.326564 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.326579 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.429837 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.429914 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.429937 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.429970 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.429989 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.533715 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.533819 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.533839 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.533871 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.533890 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.637548 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.637626 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.637645 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.637669 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.637683 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.661502 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:22:43.03590064 +0000 UTC Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.740721 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.740784 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.740799 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.740822 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.740835 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.843804 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.843858 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.843867 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.843882 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.843891 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.947298 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.947365 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.947383 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.947411 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:18 crc kubenswrapper[5118]: I0223 06:46:18.947433 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:18Z","lastTransitionTime":"2026-02-23T06:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.050036 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.050133 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.050155 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.050189 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.050211 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.153601 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.153650 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.153662 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.153691 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.153705 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.181519 5118 csr.go:261] certificate signing request csr-7prkp is approved, waiting to be issued Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.197796 5118 csr.go:257] certificate signing request csr-7prkp is issued Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.256375 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.256448 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.256467 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.256490 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.256506 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.359226 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.359276 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.359286 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.359304 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.359316 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.462205 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.462248 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.462261 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.462280 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.462299 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.565536 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.565602 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.565625 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.565664 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.565687 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.662549 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:14:32.612697591 +0000 UTC Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.669266 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.669324 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.669344 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.669372 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.669391 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.696667 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.696779 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.696849 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:19 crc kubenswrapper[5118]: E0223 06:46:19.697023 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:19 crc kubenswrapper[5118]: E0223 06:46:19.697224 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:19 crc kubenswrapper[5118]: E0223 06:46:19.697393 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.772339 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.772399 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.772412 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.772435 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.772449 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.875272 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.875315 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.875324 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.875343 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.875354 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.978206 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.978593 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.978711 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.978827 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:19 crc kubenswrapper[5118]: I0223 06:46:19.978934 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:19Z","lastTransitionTime":"2026-02-23T06:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.082237 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.082513 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.082577 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.082647 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.082717 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.185036 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.185310 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.185383 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.185462 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.185525 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.199224 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-23 06:41:19 +0000 UTC, rotation deadline is 2027-01-08 07:55:31.584648285 +0000 UTC Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.199278 5118 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7657h9m11.385373802s for next certificate rotation Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.288469 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.288520 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.288530 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.288549 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.288559 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.391391 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.391438 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.391447 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.391467 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.391482 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.494265 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.494520 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.494603 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.494692 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.494764 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.597494 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.597875 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.597988 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.598142 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.598267 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.663479 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 03:35:59.23301606 +0000 UTC Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.701862 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.702310 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.702453 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.702579 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.702714 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.806041 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.806180 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.806208 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.806245 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.806281 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.909053 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.909169 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.909188 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.909218 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:20 crc kubenswrapper[5118]: I0223 06:46:20.909238 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:20Z","lastTransitionTime":"2026-02-23T06:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.012763 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.012826 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.012842 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.012868 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.012889 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.115786 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.115850 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.115869 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.115893 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.115910 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.218917 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.218993 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.219013 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.219041 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.219080 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.323222 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.323304 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.323326 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.323356 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.323376 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.426469 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.426542 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.426560 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.426589 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.426610 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.530757 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.530827 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.530849 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.530879 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.530902 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.634593 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.634658 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.634673 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.634696 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.634711 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.664534 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 01:26:43.515635107 +0000 UTC Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.697327 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.697389 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.697333 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:21 crc kubenswrapper[5118]: E0223 06:46:21.697579 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:21 crc kubenswrapper[5118]: E0223 06:46:21.697943 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:21 crc kubenswrapper[5118]: E0223 06:46:21.698045 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.739042 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.739125 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.739138 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.739159 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.739175 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.842285 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.842354 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.842374 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.842397 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.842440 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.946668 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.946731 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.946753 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.946785 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:21 crc kubenswrapper[5118]: I0223 06:46:21.946807 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:21Z","lastTransitionTime":"2026-02-23T06:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.051371 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.051452 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.051478 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.051521 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.051550 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.155739 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.155831 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.155851 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.155882 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.155903 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.259829 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.259904 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.259926 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.259957 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.259979 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.363624 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.363678 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.363691 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.363711 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.363726 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.471956 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.472033 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.472053 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.472084 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.472132 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.575494 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.575576 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.575596 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.575629 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.575649 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.665393 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 18:46:30.970450911 +0000 UTC Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.679858 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.679943 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.679962 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.679995 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.680022 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.783477 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.783555 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.783574 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.783603 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.783623 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.880506 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.886185 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.886244 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.886262 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.886288 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.886310 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.989699 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.989747 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.989762 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.989784 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:22 crc kubenswrapper[5118]: I0223 06:46:22.989798 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:22Z","lastTransitionTime":"2026-02-23T06:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.093788 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.093860 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.093879 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.093910 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.093931 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.197053 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.197130 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.197147 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.197204 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.197221 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.300216 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.300287 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.300305 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.300327 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.300345 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.403758 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.403822 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.403839 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.403871 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.403893 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.508243 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.508327 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.508348 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.508382 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.508402 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.611788 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.611860 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.611879 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.611911 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.611931 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.666334 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 22:38:34.038132036 +0000 UTC Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.697000 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.697090 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.697161 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:23 crc kubenswrapper[5118]: E0223 06:46:23.697222 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:23 crc kubenswrapper[5118]: E0223 06:46:23.697394 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:23 crc kubenswrapper[5118]: E0223 06:46:23.697564 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.714963 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.715039 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.715063 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.715093 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.715158 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.818130 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.818175 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.818184 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.818201 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.818213 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.921566 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.921638 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.921650 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.921669 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:23 crc kubenswrapper[5118]: I0223 06:46:23.921680 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:23Z","lastTransitionTime":"2026-02-23T06:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.025227 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.025292 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.025312 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.025340 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.025362 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.128524 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.128632 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.128652 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.128716 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.128748 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.232576 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.232659 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.232683 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.232718 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.232741 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.335493 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.335568 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.335581 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.335615 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.335629 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.438432 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.438519 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.438546 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.438585 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.438605 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.443860 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.455439 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.541007 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.541054 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.541066 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.541083 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.541111 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.644008 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.644066 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.644084 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.644144 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.644167 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.666764 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 20:06:34.187017792 +0000 UTC
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.747024 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.747085 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.747127 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.747153 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.747172 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.850377 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.850451 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.850461 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.850484 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.850498 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.954238 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.954286 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.954301 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.954322 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:24 crc kubenswrapper[5118]: I0223 06:46:24.954338 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:24Z","lastTransitionTime":"2026-02-23T06:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.058378 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.058696 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.058763 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.058855 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.058951 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.163138 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.163198 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.163210 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.163235 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.163253 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.267833 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.267907 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.267926 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.267953 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.267972 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.371609 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.371677 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.371694 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.371722 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.371740 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.475521 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.475595 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.475613 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.475642 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.475664 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.578690 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.578773 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.578795 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.578826 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.578851 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.667461 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 04:27:00.344774906 +0000 UTC
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.682659 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.682731 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.682755 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.682790 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.682816 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.696304 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.696304 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.696628 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:25 crc kubenswrapper[5118]: E0223 06:46:25.696766 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:25 crc kubenswrapper[5118]: E0223 06:46:25.696956 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:25 crc kubenswrapper[5118]: E0223 06:46:25.697270 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.775204 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.786294 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.786356 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.786374 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.786402 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.786423 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.801205 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.801183319 podStartE2EDuration="1.801183319s" podCreationTimestamp="2026-02-23 06:46:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:25.800275759 +0000 UTC m=+48.804060332" watchObservedRunningTime="2026-02-23 06:46:25.801183319 +0000 UTC m=+48.804967922"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.889690 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.889751 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.889764 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.889787 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.889802 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.992725 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.992789 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.992799 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.992822 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:25 crc kubenswrapper[5118]: I0223 06:46:25.992851 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:25Z","lastTransitionTime":"2026-02-23T06:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.096681 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.096824 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.096844 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.096870 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.096890 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.200357 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.200479 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.200502 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.200571 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.200593 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.304255 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.304314 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.304326 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.304346 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.304361 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.408219 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.408290 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.408308 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.408335 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.408354 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.512258 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.512340 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.512363 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.512398 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.512420 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.543885 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.543958 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.543975 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.543998 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.544012 5118 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T06:46:26Z","lastTransitionTime":"2026-02-23T06:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.667825 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 16:11:13.956600327 +0000 UTC
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.667923 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 23 06:46:26 crc kubenswrapper[5118]: I0223 06:46:26.678724 5118 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 23 06:46:27 crc kubenswrapper[5118]: I0223 06:46:27.482055 5118 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 23 06:46:27 crc kubenswrapper[5118]: W0223 06:46:27.482858 5118 reflector.go:484] k8s.io/client-go/tools/watch/informerwatcher.go:146: watch of *v1.CertificateSigningRequest ended with: very short watch: k8s.io/client-go/tools/watch/informerwatcher.go:146: Unexpected watch close - watch lasted less than a second and no items received
Feb 23 06:46:27 crc kubenswrapper[5118]: I0223 06:46:27.697340 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:27 crc kubenswrapper[5118]: E0223 06:46:27.697562 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:27 crc kubenswrapper[5118]: I0223 06:46:27.697657 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:27 crc kubenswrapper[5118]: E0223 06:46:27.697881 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:27 crc kubenswrapper[5118]: I0223 06:46:27.698160 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:27 crc kubenswrapper[5118]: E0223 06:46:27.698345 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:28 crc kubenswrapper[5118]: I0223 06:46:28.779524 5118 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 23 06:46:29 crc kubenswrapper[5118]: I0223 06:46:29.697127 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:29 crc kubenswrapper[5118]: I0223 06:46:29.697160 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:29 crc kubenswrapper[5118]: I0223 06:46:29.697287 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:29 crc kubenswrapper[5118]: E0223 06:46:29.697378 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:29 crc kubenswrapper[5118]: E0223 06:46:29.697491 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:29 crc kubenswrapper[5118]: E0223 06:46:29.697682 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:31 crc kubenswrapper[5118]: I0223 06:46:31.697274 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:31 crc kubenswrapper[5118]: I0223 06:46:31.697285 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:31 crc kubenswrapper[5118]: E0223 06:46:31.697486 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:31 crc kubenswrapper[5118]: E0223 06:46:31.697652 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:31 crc kubenswrapper[5118]: I0223 06:46:31.697285 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:31 crc kubenswrapper[5118]: E0223 06:46:31.697835 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.607692 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lp8vs"]
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.608364 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lp8vs"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.611090 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.611925 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.613562 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.633606 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qlxj9"]
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.634304 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.635510 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-vrbqq"]
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.637180 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p48pl"]
Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.638299 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.638342 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.638376 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.638432 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.639941 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xzr6d"] Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.640311 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.640494 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.644953 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.645072 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.645218 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.645493 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.646214 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.648367 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.648687 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.648745 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.648836 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.649308 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.649947 5118 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.650576 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.650777 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.651293 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.652193 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.657660 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.748800 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wwxh8"] Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.749342 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.752167 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.752410 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.752832 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.753593 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.762832 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.762910 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-multus\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.762969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-daemon-config\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " 
pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763168 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763217 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763252 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763289 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-system-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763327 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert\") pod \"ovnkube-node-p48pl\" (UID: 
\"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763356 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cni-binary-copy\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763379 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-netns\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763400 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbm5\" (UniqueName: \"kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763422 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-os-release\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763441 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-bin\") pod 
\"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763464 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763501 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763527 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-hostroot\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763559 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5pc\" (UniqueName: \"kubernetes.io/projected/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-kube-api-access-pg5pc\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763595 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763630 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763660 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763691 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763736 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763764 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-kubelet\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763796 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-socket-dir-parent\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763827 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr2gt\" (UniqueName: \"kubernetes.io/projected/9c07ac33-34ef-4ece-951f-2369fef2b726-kube-api-access-nr2gt\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763862 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763893 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763918 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.763944 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764028 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764055 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-conf-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764077 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88qwx\" (UniqueName: \"kubernetes.io/projected/282b5bbe-e1e1-4b22-a815-bb27d70e550d-kube-api-access-88qwx\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764125 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764148 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrfx\" (UniqueName: \"kubernetes.io/projected/bb17ee96-c045-4fa3-88c4-196083c286b5-kube-api-access-mrrfx\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764196 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-os-release\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764219 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-k8s-cni-cncf-io\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764238 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c07ac33-34ef-4ece-951f-2369fef2b726-hosts-file\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764260 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-cnibin\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764278 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-etc-kubernetes\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764303 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764324 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764344 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764367 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-rootfs\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764386 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764405 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764425 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-system-cni-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " 
pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764446 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-proxy-tls\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764466 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cnibin\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.764675 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-multus-certs\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.800498 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf"] Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.801164 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.804041 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.806185 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.806189 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.806454 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865602 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865670 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrfx\" (UniqueName: \"kubernetes.io/projected/bb17ee96-c045-4fa3-88c4-196083c286b5-kube-api-access-mrrfx\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865711 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-os-release\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " 
pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865750 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-k8s-cni-cncf-io\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865782 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c07ac33-34ef-4ece-951f-2369fef2b726-hosts-file\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865790 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865820 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-cnibin\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865985 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-k8s-cni-cncf-io\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865960 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c07ac33-34ef-4ece-951f-2369fef2b726-hosts-file\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866062 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-etc-kubernetes\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.865949 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-os-release\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866138 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-etc-kubernetes\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866056 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-cnibin\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866163 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866270 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866196 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866337 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866396 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866402 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-rootfs\") pod 
\"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866455 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4cqk\" (UniqueName: \"kubernetes.io/projected/99701be3-406d-49bd-a103-8fd45233d1b7-kube-api-access-n4cqk\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866493 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866558 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866598 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-system-cni-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: 
\"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-rootfs\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866635 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-proxy-tls\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866668 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-system-cni-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866676 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cnibin\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866638 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866706 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-multus-certs\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99701be3-406d-49bd-a103-8fd45233d1b7-host\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866736 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cnibin\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866767 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866777 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-multus-certs\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866809 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866827 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866849 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866868 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866871 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866885 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-system-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866905 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-multus\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866922 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866930 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-daemon-config\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.866988 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cni-binary-copy\") pod \"multus-xzr6d\" (UID: 
\"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867001 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-multus\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867023 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-netns\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867038 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-system-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867068 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbm5\" (UniqueName: \"kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867124 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-os-release\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc 
kubenswrapper[5118]: I0223 06:46:32.867156 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-bin\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867186 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867235 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867263 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-hostroot\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5pc\" (UniqueName: \"kubernetes.io/projected/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-kube-api-access-pg5pc\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867330 
5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99701be3-406d-49bd-a103-8fd45233d1b7-serviceca\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867374 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867440 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867474 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867702 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867744 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-kubelet\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867795 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867834 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-socket-dir-parent\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867906 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr2gt\" (UniqueName: \"kubernetes.io/projected/9c07ac33-34ef-4ece-951f-2369fef2b726-kube-api-access-nr2gt\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.867047 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-run-netns\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868010 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868066 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868145 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-cni-binary-copy\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868164 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868211 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-daemon-config\") pod \"multus-xzr6d\" (UID: 
\"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868219 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868253 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-os-release\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868287 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-hostroot\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868324 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-cni-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868327 
5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868341 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868370 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-cni-bin\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868191 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868390 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-conf-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868406 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868400 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb17ee96-c045-4fa3-88c4-196083c286b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868454 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88qwx\" (UniqueName: \"kubernetes.io/projected/282b5bbe-e1e1-4b22-a815-bb27d70e550d-kube-api-access-88qwx\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-conf-dir\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868510 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-host-var-lib-kubelet\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.868547 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869079 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869086 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/282b5bbe-e1e1-4b22-a815-bb27d70e550d-multus-socket-dir-parent\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869189 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869587 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.869781 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bb17ee96-c045-4fa3-88c4-196083c286b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.870019 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.876334 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-proxy-tls\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.877620 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.888740 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5pc\" (UniqueName: \"kubernetes.io/projected/d3ecfa2c-410e-49e5-86ab-f386efab9cf6-kube-api-access-pg5pc\") pod \"machine-config-daemon-qlxj9\" (UID: \"d3ecfa2c-410e-49e5-86ab-f386efab9cf6\") " pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.891398 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qwx\" (UniqueName: \"kubernetes.io/projected/282b5bbe-e1e1-4b22-a815-bb27d70e550d-kube-api-access-88qwx\") pod \"multus-xzr6d\" (UID: \"282b5bbe-e1e1-4b22-a815-bb27d70e550d\") " pod="openshift-multus/multus-xzr6d" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.892128 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrfx\" (UniqueName: \"kubernetes.io/projected/bb17ee96-c045-4fa3-88c4-196083c286b5-kube-api-access-mrrfx\") pod \"multus-additional-cni-plugins-vrbqq\" (UID: \"bb17ee96-c045-4fa3-88c4-196083c286b5\") " pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.893893 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbm5\" (UniqueName: 
\"kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5\") pod \"ovnkube-node-p48pl\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.894888 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr2gt\" (UniqueName: \"kubernetes.io/projected/9c07ac33-34ef-4ece-951f-2369fef2b726-kube-api-access-nr2gt\") pod \"node-resolver-lp8vs\" (UID: \"9c07ac33-34ef-4ece-951f-2369fef2b726\") " pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.930901 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lp8vs" Feb 23 06:46:32 crc kubenswrapper[5118]: W0223 06:46:32.950702 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c07ac33_34ef_4ece_951f_2369fef2b726.slice/crio-898b3a7ee7757a7d7716fd86c725a9dfcb02e3ca7f800a630a3d12621e972246 WatchSource:0}: Error finding container 898b3a7ee7757a7d7716fd86c725a9dfcb02e3ca7f800a630a3d12621e972246: Status 404 returned error can't find the container with id 898b3a7ee7757a7d7716fd86c725a9dfcb02e3ca7f800a630a3d12621e972246 Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969273 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4cqk\" (UniqueName: \"kubernetes.io/projected/99701be3-406d-49bd-a103-8fd45233d1b7-kube-api-access-n4cqk\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99701be3-406d-49bd-a103-8fd45233d1b7-host\") pod \"node-ca-wwxh8\" (UID: 
\"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969377 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/253bee8e-cc2a-4586-817f-a254b5d7e319-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969419 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969454 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99701be3-406d-49bd-a103-8fd45233d1b7-host\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969465 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/253bee8e-cc2a-4586-817f-a254b5d7e319-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969669 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99701be3-406d-49bd-a103-8fd45233d1b7-serviceca\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.969731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bee8e-cc2a-4586-817f-a254b5d7e319-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.974240 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.981018 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99701be3-406d-49bd-a103-8fd45233d1b7-serviceca\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.984201 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.993460 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4cqk\" (UniqueName: \"kubernetes.io/projected/99701be3-406d-49bd-a103-8fd45233d1b7-kube-api-access-n4cqk\") pod \"node-ca-wwxh8\" (UID: \"99701be3-406d-49bd-a103-8fd45233d1b7\") " pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:32 crc kubenswrapper[5118]: I0223 06:46:32.993764 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.003914 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xzr6d" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.012068 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lp8vs" event={"ID":"9c07ac33-34ef-4ece-951f-2369fef2b726","Type":"ContainerStarted","Data":"898b3a7ee7757a7d7716fd86c725a9dfcb02e3ca7f800a630a3d12621e972246"} Feb 23 06:46:33 crc kubenswrapper[5118]: W0223 06:46:33.020425 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb17ee96_c045_4fa3_88c4_196083c286b5.slice/crio-465c9e0e081d3a5f31c040a6cdbbc10e64f2f2b2c67c8d9d30ecfc28cf82e5f0 WatchSource:0}: Error finding container 465c9e0e081d3a5f31c040a6cdbbc10e64f2f2b2c67c8d9d30ecfc28cf82e5f0: Status 404 returned error can't find the container with id 465c9e0e081d3a5f31c040a6cdbbc10e64f2f2b2c67c8d9d30ecfc28cf82e5f0 Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.063495 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wwxh8" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.067566 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6"] Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.068149 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070471 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070530 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/253bee8e-cc2a-4586-817f-a254b5d7e319-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070661 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bee8e-cc2a-4586-817f-a254b5d7e319-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070724 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/253bee8e-cc2a-4586-817f-a254b5d7e319-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070737 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070816 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.070759 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/253bee8e-cc2a-4586-817f-a254b5d7e319-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.071883 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bee8e-cc2a-4586-817f-a254b5d7e319-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: 
\"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.073130 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.077724 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/253bee8e-cc2a-4586-817f-a254b5d7e319-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.089358 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/253bee8e-cc2a-4586-817f-a254b5d7e319-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t79jf\" (UID: \"253bee8e-cc2a-4586-817f-a254b5d7e319\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.119502 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.121772 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9qwhq"] Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.123127 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.123214 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff" Feb 23 06:46:33 crc kubenswrapper[5118]: W0223 06:46:33.145056 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99701be3_406d_49bd_a103_8fd45233d1b7.slice/crio-bb5e25d6cb1df3ee82bbd00183b1ab09fbc42e5df123ac0f8c61bca087132711 WatchSource:0}: Error finding container bb5e25d6cb1df3ee82bbd00183b1ab09fbc42e5df123ac0f8c61bca087132711: Status 404 returned error can't find the container with id bb5e25d6cb1df3ee82bbd00183b1ab09fbc42e5df123ac0f8c61bca087132711 Feb 23 06:46:33 crc kubenswrapper[5118]: W0223 06:46:33.145767 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod253bee8e_cc2a_4586_817f_a254b5d7e319.slice/crio-accde6f1fa7b909e72a563884d9b93863bc37e9afcbaf1fce50a5e1202eccc8c WatchSource:0}: Error finding container accde6f1fa7b909e72a563884d9b93863bc37e9afcbaf1fce50a5e1202eccc8c: Status 404 returned error can't find the container with id accde6f1fa7b909e72a563884d9b93863bc37e9afcbaf1fce50a5e1202eccc8c Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.172455 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.172520 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2lgx\" (UniqueName: \"kubernetes.io/projected/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-kube-api-access-b2lgx\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.172625 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.172707 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274233 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b2lgx\" (UniqueName: \"kubernetes.io/projected/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-kube-api-access-b2lgx\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274338 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274369 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzz5z\" (UniqueName: \"kubernetes.io/projected/57709b5a-89ef-4120-af9b-d96006564eff-kube-api-access-rzz5z\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274393 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.274413 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " 
pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.275173 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.275881 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.278148 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.290866 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2lgx\" (UniqueName: \"kubernetes.io/projected/6e7d32a1-e633-44f2-af39-4ff2fa7b45df-kube-api-access-b2lgx\") pod \"ovnkube-control-plane-749d76644c-5n4w6\" (UID: \"6e7d32a1-e633-44f2-af39-4ff2fa7b45df\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.375290 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.375435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzz5z\" (UniqueName: \"kubernetes.io/projected/57709b5a-89ef-4120-af9b-d96006564eff-kube-api-access-rzz5z\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.375915 5118 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.375995 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs podName:57709b5a-89ef-4120-af9b-d96006564eff nodeName:}" failed. No retries permitted until 2026-02-23 06:46:33.875972038 +0000 UTC m=+56.879756621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs") pod "network-metrics-daemon-9qwhq" (UID: "57709b5a-89ef-4120-af9b-d96006564eff") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.396026 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzz5z\" (UniqueName: \"kubernetes.io/projected/57709b5a-89ef-4120-af9b-d96006564eff-kube-api-access-rzz5z\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.488681 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" Feb 23 06:46:33 crc kubenswrapper[5118]: W0223 06:46:33.503505 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7d32a1_e633_44f2_af39_4ff2fa7b45df.slice/crio-0f0ec17de44cccb5e68467a59751f1c320c660f99a5cb5a923c1ff510254a682 WatchSource:0}: Error finding container 0f0ec17de44cccb5e68467a59751f1c320c660f99a5cb5a923c1ff510254a682: Status 404 returned error can't find the container with id 0f0ec17de44cccb5e68467a59751f1c320c660f99a5cb5a923c1ff510254a682 Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.679643 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.679817 5118 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:47:05.679780852 +0000 UTC m=+88.683565435 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.680230 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.680299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.680339 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.680370 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680456 5118 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680509 5118 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680518 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680612 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680640 5118 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680519 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: 
object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680692 5118 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680708 5118 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680533 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:05.680511459 +0000 UTC m=+88.684296052 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680780 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:05.680767204 +0000 UTC m=+88.684551787 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680800 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:05.680791265 +0000 UTC m=+88.684575848 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.680815 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 06:47:05.680807925 +0000 UTC m=+88.684592508 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.696303 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.696336 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.696357 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.696466 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.696629 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.696786 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:33 crc kubenswrapper[5118]: I0223 06:46:33.881669 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.881909 5118 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:33 crc kubenswrapper[5118]: E0223 06:46:33.882027 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs podName:57709b5a-89ef-4120-af9b-d96006564eff nodeName:}" failed. No retries permitted until 2026-02-23 06:46:34.881993904 +0000 UTC m=+57.885778677 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs") pod "network-metrics-daemon-9qwhq" (UID: "57709b5a-89ef-4120-af9b-d96006564eff") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.020142 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wwxh8" event={"ID":"99701be3-406d-49bd-a103-8fd45233d1b7","Type":"ContainerStarted","Data":"a69a33f7fbda6846369ebab93f9560f93b87c7c2bd527467c2703df7c09b7028"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.020216 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wwxh8" event={"ID":"99701be3-406d-49bd-a103-8fd45233d1b7","Type":"ContainerStarted","Data":"bb5e25d6cb1df3ee82bbd00183b1ab09fbc42e5df123ac0f8c61bca087132711"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.023479 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lp8vs" event={"ID":"9c07ac33-34ef-4ece-951f-2369fef2b726","Type":"ContainerStarted","Data":"1cfcf66bdedba787dedeaab734c8077fd1014eebf00bb7ca8f1dd2e5dbd956b3"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.026337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzr6d" event={"ID":"282b5bbe-e1e1-4b22-a815-bb27d70e550d","Type":"ContainerStarted","Data":"ffa6846e24eb1902b29fbc8f9415724c2183c693000f6a00607cfed96c60878b"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.026391 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzr6d" event={"ID":"282b5bbe-e1e1-4b22-a815-bb27d70e550d","Type":"ContainerStarted","Data":"1067dfe71d154a2ec330e77fff830abfa269abf86157e9c6a06a94eebf586607"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.029081 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="bae2ed07f87da1be898e4aa3aaa1f9a0548b5e2649111354cff7fdd74c41836d" exitCode=0 Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.029214 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"bae2ed07f87da1be898e4aa3aaa1f9a0548b5e2649111354cff7fdd74c41836d"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.029271 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerStarted","Data":"465c9e0e081d3a5f31c040a6cdbbc10e64f2f2b2c67c8d9d30ecfc28cf82e5f0"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.033273 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"06aa1ec42b2794d7eb29a2166a4728d755e5e2799050950ee4d96651a83c3ebb"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.033319 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.033336 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"ded167ba7b7a0c7da44ebae7e5d21848d78b6acaef05e6c5956b8790fa2d73fc"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.036323 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" 
event={"ID":"6e7d32a1-e633-44f2-af39-4ff2fa7b45df","Type":"ContainerStarted","Data":"7ec933dda594356528cccbf6abc48fd60e248c48c80a5a883e5bf168ca8b351d"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.036383 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" event={"ID":"6e7d32a1-e633-44f2-af39-4ff2fa7b45df","Type":"ContainerStarted","Data":"caf14dc7c677b4d1dead57f7a6cd8c943d44d7cba70085b26c243980c182d392"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.036397 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" event={"ID":"6e7d32a1-e633-44f2-af39-4ff2fa7b45df","Type":"ContainerStarted","Data":"0f0ec17de44cccb5e68467a59751f1c320c660f99a5cb5a923c1ff510254a682"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.038606 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" exitCode=0 Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.038685 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.038712 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"32d3b59891e6e435f7646b74dfdc7379a80499da5f75691c15033df1edb67a62"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.041662 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" 
event={"ID":"253bee8e-cc2a-4586-817f-a254b5d7e319","Type":"ContainerStarted","Data":"c6249771d1daa48c6cfc9be0cc94cbf6086f596890ea7a71cb10f7b7b1225903"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.041732 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" event={"ID":"253bee8e-cc2a-4586-817f-a254b5d7e319","Type":"ContainerStarted","Data":"accde6f1fa7b909e72a563884d9b93863bc37e9afcbaf1fce50a5e1202eccc8c"} Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.047794 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wwxh8" podStartSLOduration=15.047751651 podStartE2EDuration="15.047751651s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.044690971 +0000 UTC m=+57.048475554" watchObservedRunningTime="2026-02-23 06:46:34.047751651 +0000 UTC m=+57.051536254" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.104705 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xzr6d" podStartSLOduration=15.10467132 podStartE2EDuration="15.10467132s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.104576258 +0000 UTC m=+57.108360841" watchObservedRunningTime="2026-02-23 06:46:34.10467132 +0000 UTC m=+57.108455923" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.123734 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t79jf" podStartSLOduration=15.123699391 podStartE2EDuration="15.123699391s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.123617238 +0000 UTC m=+57.127401831" watchObservedRunningTime="2026-02-23 06:46:34.123699391 +0000 UTC m=+57.127483974" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.180385 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lp8vs" podStartSLOduration=15.180354772 podStartE2EDuration="15.180354772s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.14916518 +0000 UTC m=+57.152949783" watchObservedRunningTime="2026-02-23 06:46:34.180354772 +0000 UTC m=+57.184139345" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.215039 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podStartSLOduration=15.215010685 podStartE2EDuration="15.215010685s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.198710167 +0000 UTC m=+57.202494760" watchObservedRunningTime="2026-02-23 06:46:34.215010685 +0000 UTC m=+57.218795258" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.215643 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5n4w6" podStartSLOduration=14.215639069 podStartE2EDuration="14.215639069s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:34.214989644 +0000 UTC m=+57.218774217" watchObservedRunningTime="2026-02-23 06:46:34.215639069 +0000 UTC m=+57.219423642" Feb 23 06:46:34 crc 
kubenswrapper[5118]: I0223 06:46:34.696676 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:34 crc kubenswrapper[5118]: E0223 06:46:34.697241 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff" Feb 23 06:46:34 crc kubenswrapper[5118]: I0223 06:46:34.893183 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:34 crc kubenswrapper[5118]: E0223 06:46:34.893813 5118 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:34 crc kubenswrapper[5118]: E0223 06:46:34.893884 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs podName:57709b5a-89ef-4120-af9b-d96006564eff nodeName:}" failed. No retries permitted until 2026-02-23 06:46:36.893864163 +0000 UTC m=+59.897648756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs") pod "network-metrics-daemon-9qwhq" (UID: "57709b5a-89ef-4120-af9b-d96006564eff") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.057225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.057290 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.057304 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.057316 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.059072 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerStarted","Data":"5de7228eb7e647f2e693bbeba433be5779425b204ca15637f79cd0c47fa73948"} Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.697413 5118 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.697508 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:35 crc kubenswrapper[5118]: I0223 06:46:35.697439 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:35 crc kubenswrapper[5118]: E0223 06:46:35.697667 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:35 crc kubenswrapper[5118]: E0223 06:46:35.697802 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:35 crc kubenswrapper[5118]: E0223 06:46:35.697930 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.066929 5118 generic.go:334] "Generic (PLEG): container finished" podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="5de7228eb7e647f2e693bbeba433be5779425b204ca15637f79cd0c47fa73948" exitCode=0 Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.067001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"5de7228eb7e647f2e693bbeba433be5779425b204ca15637f79cd0c47fa73948"} Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.074626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.074705 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.696833 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:36 crc kubenswrapper[5118]: E0223 06:46:36.697385 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff" Feb 23 06:46:36 crc kubenswrapper[5118]: I0223 06:46:36.912379 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:36 crc kubenswrapper[5118]: E0223 06:46:36.912647 5118 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:36 crc kubenswrapper[5118]: E0223 06:46:36.912762 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs podName:57709b5a-89ef-4120-af9b-d96006564eff nodeName:}" failed. No retries permitted until 2026-02-23 06:46:40.912729668 +0000 UTC m=+63.916514281 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs") pod "network-metrics-daemon-9qwhq" (UID: "57709b5a-89ef-4120-af9b-d96006564eff") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 06:46:37 crc kubenswrapper[5118]: I0223 06:46:37.084471 5118 generic.go:334] "Generic (PLEG): container finished" podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="4addaea7bf7c6ba5ae786bcb5be62d10c3cc629f202bc3d7f400d22d3c73c404" exitCode=0 Feb 23 06:46:37 crc kubenswrapper[5118]: I0223 06:46:37.084576 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"4addaea7bf7c6ba5ae786bcb5be62d10c3cc629f202bc3d7f400d22d3c73c404"} Feb 23 06:46:37 crc kubenswrapper[5118]: I0223 06:46:37.696372 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:37 crc kubenswrapper[5118]: I0223 06:46:37.696372 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:37 crc kubenswrapper[5118]: I0223 06:46:37.696448 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:37 crc kubenswrapper[5118]: E0223 06:46:37.698989 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:37 crc kubenswrapper[5118]: E0223 06:46:37.699238 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:37 crc kubenswrapper[5118]: E0223 06:46:37.699449 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:38 crc kubenswrapper[5118]: I0223 06:46:38.094983 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} Feb 23 06:46:38 crc kubenswrapper[5118]: I0223 06:46:38.099977 5118 generic.go:334] "Generic (PLEG): container finished" podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="ec837e7733e048df4c225250801dd10d38e364e0910e9dccdb60cadd7caf3e5c" exitCode=0 Feb 23 06:46:38 crc kubenswrapper[5118]: I0223 06:46:38.100030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"ec837e7733e048df4c225250801dd10d38e364e0910e9dccdb60cadd7caf3e5c"} Feb 23 06:46:38 crc 
kubenswrapper[5118]: I0223 06:46:38.696749 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:38 crc kubenswrapper[5118]: E0223 06:46:38.697052 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff" Feb 23 06:46:39 crc kubenswrapper[5118]: I0223 06:46:39.109851 5118 generic.go:334] "Generic (PLEG): container finished" podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="9755425cb74b70e4683a8cdce7a478374b71bb268350e9c92dc30f73d47362a7" exitCode=0 Feb 23 06:46:39 crc kubenswrapper[5118]: I0223 06:46:39.109957 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"9755425cb74b70e4683a8cdce7a478374b71bb268350e9c92dc30f73d47362a7"} Feb 23 06:46:39 crc kubenswrapper[5118]: I0223 06:46:39.696706 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:39 crc kubenswrapper[5118]: E0223 06:46:39.697406 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 06:46:39 crc kubenswrapper[5118]: I0223 06:46:39.696872 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:39 crc kubenswrapper[5118]: E0223 06:46:39.697507 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 06:46:39 crc kubenswrapper[5118]: I0223 06:46:39.696725 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:39 crc kubenswrapper[5118]: E0223 06:46:39.697588 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.123065 5118 generic.go:334] "Generic (PLEG): container finished" podID="bb17ee96-c045-4fa3-88c4-196083c286b5" containerID="3d1298f53e8944006bbafa20e9bcd4cf08e98d01b6e27d3d2882ffad3520e143" exitCode=0 Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.123200 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerDied","Data":"3d1298f53e8944006bbafa20e9bcd4cf08e98d01b6e27d3d2882ffad3520e143"} Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.135349 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerStarted","Data":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.137441 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.137542 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.137762 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.189168 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.190804 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 
06:46:40.210717 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podStartSLOduration=21.210678288 podStartE2EDuration="21.210678288s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:40.210580776 +0000 UTC m=+63.214365359" watchObservedRunningTime="2026-02-23 06:46:40.210678288 +0000 UTC m=+63.214462901"
Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.697253 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq"
Feb 23 06:46:40 crc kubenswrapper[5118]: E0223 06:46:40.697767 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff"
Feb 23 06:46:40 crc kubenswrapper[5118]: I0223 06:46:40.957934 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq"
Feb 23 06:46:40 crc kubenswrapper[5118]: E0223 06:46:40.958255 5118 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 06:46:40 crc kubenswrapper[5118]: E0223 06:46:40.958355 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs podName:57709b5a-89ef-4120-af9b-d96006564eff nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.958325229 +0000 UTC m=+71.962109842 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs") pod "network-metrics-daemon-9qwhq" (UID: "57709b5a-89ef-4120-af9b-d96006564eff") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 06:46:41 crc kubenswrapper[5118]: I0223 06:46:41.147052 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" event={"ID":"bb17ee96-c045-4fa3-88c4-196083c286b5","Type":"ContainerStarted","Data":"dabda1d2c1e49e9c0b3f98d337724356eab01840dd883f711cd32f8d7befd328"}
Feb 23 06:46:41 crc kubenswrapper[5118]: I0223 06:46:41.184715 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vrbqq" podStartSLOduration=22.184685181 podStartE2EDuration="22.184685181s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:41.181739123 +0000 UTC m=+64.185523736" watchObservedRunningTime="2026-02-23 06:46:41.184685181 +0000 UTC m=+64.188469784"
Feb 23 06:46:41 crc kubenswrapper[5118]: I0223 06:46:41.697300 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:41 crc kubenswrapper[5118]: I0223 06:46:41.697332 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:41 crc kubenswrapper[5118]: I0223 06:46:41.697371 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:41 crc kubenswrapper[5118]: E0223 06:46:41.697498 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:41 crc kubenswrapper[5118]: E0223 06:46:41.697639 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:41 crc kubenswrapper[5118]: E0223 06:46:41.697790 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:42 crc kubenswrapper[5118]: I0223 06:46:42.013537 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9qwhq"]
Feb 23 06:46:42 crc kubenswrapper[5118]: I0223 06:46:42.013717 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq"
Feb 23 06:46:42 crc kubenswrapper[5118]: E0223 06:46:42.013884 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff"
Feb 23 06:46:42 crc kubenswrapper[5118]: I0223 06:46:42.718185 5118 csr.go:261] certificate signing request csr-rmpcj is approved, waiting to be issued
Feb 23 06:46:42 crc kubenswrapper[5118]: I0223 06:46:42.726593 5118 csr.go:257] certificate signing request csr-rmpcj is issued
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.696501 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.696549 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.696529 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq"
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.696501 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 06:46:43 crc kubenswrapper[5118]: E0223 06:46:43.696686 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 06:46:43 crc kubenswrapper[5118]: E0223 06:46:43.696803 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 06:46:43 crc kubenswrapper[5118]: E0223 06:46:43.697207 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9qwhq" podUID="57709b5a-89ef-4120-af9b-d96006564eff"
Feb 23 06:46:43 crc kubenswrapper[5118]: E0223 06:46:43.697270 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.728883 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-23 06:41:42 +0000 UTC, rotation deadline is 2026-12-14 01:47:19.167275217 +0000 UTC
Feb 23 06:46:43 crc kubenswrapper[5118]: I0223 06:46:43.730195 5118 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7051h0m35.437122638s for next certificate rotation
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.306315 5118 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.306879 5118 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.350015 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m4mzh"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.350812 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.356930 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.357255 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.357361 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.357417 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.358573 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.358710 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.364743 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.356937 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.365357 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.366057 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wtljz"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.366594 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.366776 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.367100 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.367195 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.367197 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.368290 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.370676 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.370963 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.371436 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.371600 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.371893 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.376273 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6hnk"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.377283 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.378315 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.378634 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.379430 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.378778 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.380144 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gvkls"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.380645 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.381510 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.382241 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-r9t74"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.382543 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.382244 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.384000 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.385967 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.386647 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.386679 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.396296 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sz8sg"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.397331 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sz8sg"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.398213 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.416800 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.417805 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f7t79"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.418628 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.434966 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f7t79"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.435461 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.436084 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.436418 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.436685 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.436970 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.438611 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.438858 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.439005 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.439455 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.452247 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.452654 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.452757 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.452839 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.452969 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.453552 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.455482 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.455643 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.455715 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.455872 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.456028 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.456841 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.457002 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.457125 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458351 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458500 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458751 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458817 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458849 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/871941dc-2eb3-4bb4-8db4-e1a726f1171e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458873 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x47f\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-kube-api-access-2x47f\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458896 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-service-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458916 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-serving-cert\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458934 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458958 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjkxx\" (UniqueName: \"kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458963 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458973 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.458993 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459013 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/144b5994-4ccc-47ba-9c33-4d94119f2a07-trusted-ca\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459022 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459029 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-node-pullsecrets\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459046 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-encryption-config\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459064 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6ms\" (UniqueName: \"kubernetes.io/projected/0808f45e-026f-464a-8e52-db360912b7a5-kube-api-access-qh6ms\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459086 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459108 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459138 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-client\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459175 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459276 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459328 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459410 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459671 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459681 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459789 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459877 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.459973 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.461127 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.461288 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.461384 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.461621 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.462246 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.463045 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.463074 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.463646 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.463814 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.469617 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.470298 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.473969 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474034 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474097 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474284 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474395 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474486 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474817 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474841 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.474984 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.475323 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.475726 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.476158 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.477543 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.477855 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.477963 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.478055 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.478926 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.479079 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.479213 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.479386 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.479473 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.486190 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9kjcn"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.486878 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.487394 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.487715 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.487782 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.488452 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.488631 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.488849 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.488957 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489027 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489035 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489188 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489148 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489314 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.489421 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.490659 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.491097 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495425 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495506 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495526 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xpcs\" (UniqueName: \"kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495557 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9zl\" (UniqueName: \"kubernetes.io/projected/300dcadc-c269-45fb-b9b9-7ea4cca524b5-kube-api-access-cl9zl\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495578 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495596 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495616 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495634 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495653 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dd3bd945-b0c4-404c-a066-c7fd19e177f6-machine-approver-tls\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495672 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495692 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-encryption-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495734 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-images\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495775 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/144b5994-4ccc-47ba-9c33-4d94119f2a07-metrics-tls\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495798 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300dcadc-c269-45fb-b9b9-7ea4cca524b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495818 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqc4v\" (UniqueName: \"kubernetes.io/projected/e6b9c8e3-352b-4295-b407-6121f10f878f-kube-api-access-wqc4v\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495850 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495871 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495889 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/300dcadc-c269-45fb-b9b9-7ea4cca524b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495907 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kf4\" (UniqueName: \"kubernetes.io/projected/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-kube-api-access-s5kf4\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495922 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zphm\" (UniqueName: \"kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495941 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.495986 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496003 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-config\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496022 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496037 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-config\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496054 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496070 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496086 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496127 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/871941dc-2eb3-4bb4-8db4-e1a726f1171e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496146 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-trusted-ca\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496162 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496182 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-dir\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496220 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6s2p\" (UniqueName: \"kubernetes.io/projected/509bc0a3-3bfa-4d55-b42d-3f584823ba57-kube-api-access-m6s2p\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-image-import-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496254 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2x4r\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-kube-api-access-q2x4r\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496270 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnpft\" (UniqueName: \"kubernetes.io/projected/dd3bd945-b0c4-404c-a066-c7fd19e177f6-kube-api-access-hnpft\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496285 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496301 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496315 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-config\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496330 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496350 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p72v7\" (UniqueName: \"kubernetes.io/projected/871941dc-2eb3-4bb4-8db4-e1a726f1171e-kube-api-access-p72v7\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496368 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b9c8e3-352b-4295-b407-6121f10f878f-serving-cert\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496386 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496401 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-auth-proxy-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496415 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-serving-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496431 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496448 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-z8n4c\" (UniqueName: \"kubernetes.io/projected/f790193b-0058-41ab-8320-0819ec673fb9-kube-api-access-z8n4c\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496465 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-serving-cert\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496480 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-policies\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496498 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/509bc0a3-3bfa-4d55-b42d-3f584823ba57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496514 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496534 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0808f45e-026f-464a-8e52-db360912b7a5-metrics-tls\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496549 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f790193b-0058-41ab-8320-0819ec673fb9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496564 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit-dir\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496583 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496615 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496631 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfpl\" (UniqueName: \"kubernetes.io/projected/71c2245d-dcc3-4d90-abaf-381b5784bcc5-kube-api-access-fwfpl\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496645 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-serving-cert\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496677 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496698 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-client\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.496727 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqgc9\" (UniqueName: \"kubernetes.io/projected/04a64a02-6f03-45e0-8c06-47d5f452adc8-kube-api-access-bqgc9\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.497811 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.498762 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.498863 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.499049 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.503882 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rck9k"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.505422 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.505854 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.506218 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507008 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-sqp4p"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507314 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507517 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sqp4p"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507635 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507840 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.507998 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.509454 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.509747 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.509918 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510039 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510309 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510471 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510579 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510690 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.510806 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.513469 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.539088 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-548rl"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.541360 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.543236 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.543853 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.544364 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.545771 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-77kkb"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.546062 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.548706 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.557459 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.557850 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.575462 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.578196 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.578369 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.580751 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.581667 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.581804 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8wn"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.582641 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.583191 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m4mzh"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.583764 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wtljz"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.584322 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.585067 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.586007 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6hnk"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.586664 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.587121 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zksp"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.587791 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.587898 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.588332 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.588714 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9kjcn"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.589638 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.589754 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.589817 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.590906 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.592978 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.593400 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597430 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597542 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-auth-proxy-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597653 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-serving-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597743 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597850 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8n4c\" (UniqueName: \"kubernetes.io/projected/f790193b-0058-41ab-8320-0819ec673fb9-kube-api-access-z8n4c\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.597959 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-serving-cert\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598061 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-policies\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598172 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/509bc0a3-3bfa-4d55-b42d-3f584823ba57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598265 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598371 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587284f2-ccb4-46bb-8851-aa4e346530de-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598455 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-serving-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598471 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-profile-collector-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598635 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-srv-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598749 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0808f45e-026f-464a-8e52-db360912b7a5-metrics-tls\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f790193b-0058-41ab-8320-0819ec673fb9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.598945 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit-dir\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599048 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599186 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b57717-c607-4420-8944-8aadad02680e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599265 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-policies\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599456 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfpl\" (UniqueName: \"kubernetes.io/projected/71c2245d-dcc3-4d90-abaf-381b5784bcc5-kube-api-access-fwfpl\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-serving-cert\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599653 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599816 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.599923 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600019 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-client\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600124 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqgc9\" (UniqueName: \"kubernetes.io/projected/04a64a02-6f03-45e0-8c06-47d5f452adc8-kube-api-access-bqgc9\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600231 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-serving-cert\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/871941dc-2eb3-4bb4-8db4-e1a726f1171e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x47f\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-kube-api-access-2x47f\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600508 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-service-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600601 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-images\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600697 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600794 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600885 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjkxx\" (UniqueName: \"kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600976 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601063 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/144b5994-4ccc-47ba-9c33-4d94119f2a07-trusted-ca\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601225 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-node-pullsecrets\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601333 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-encryption-config\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21b57717-c607-4420-8944-8aadad02680e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601519 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6ms\" (UniqueName: \"kubernetes.io/projected/0808f45e-026f-464a-8e52-db360912b7a5-kube-api-access-qh6ms\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601624 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601707 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-auth-proxy-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601716 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587284f2-ccb4-46bb-8851-aa4e346530de-config\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601795 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xpcs\" (UniqueName: \"kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601829 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601854 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-client\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601873 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601890 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601915 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b57717-c607-4420-8944-8aadad02680e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601943 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9zl\" (UniqueName: \"kubernetes.io/projected/300dcadc-c269-45fb-b9b9-7ea4cca524b5-kube-api-access-cl9zl\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601961 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587284f2-ccb4-46bb-8851-aa4e346530de-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.601983 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") "
pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602002 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602038 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602056 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dd3bd945-b0c4-404c-a066-c7fd19e177f6-machine-approver-tls\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602072 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602090 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-images\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602130 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602133 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602147 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-encryption-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602166 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqc4v\" (UniqueName: \"kubernetes.io/projected/e6b9c8e3-352b-4295-b407-6121f10f878f-kube-api-access-wqc4v\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: 
\"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/144b5994-4ccc-47ba-9c33-4d94119f2a07-metrics-tls\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602220 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300dcadc-c269-45fb-b9b9-7ea4cca524b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602244 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602264 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602284 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602304 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/300dcadc-c269-45fb-b9b9-7ea4cca524b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kf4\" (UniqueName: \"kubernetes.io/projected/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-kube-api-access-s5kf4\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602340 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zphm\" (UniqueName: \"kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969gb\" (UniqueName: \"kubernetes.io/projected/813a6df4-23c6-4969-a911-9f74fa603ccf-kube-api-access-969gb\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602381 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-config\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602403 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602425 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602448 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602488 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-config\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602506 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602525 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602550 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871941dc-2eb3-4bb4-8db4-e1a726f1171e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602572 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-trusted-ca\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602600 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602629 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-dir\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602651 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gnd\" (UniqueName: \"kubernetes.io/projected/2621c2c6-b102-4d43-8733-095ef4181f5f-kube-api-access-z6gnd\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602671 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813a6df4-23c6-4969-a911-9f74fa603ccf-proxy-tls\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602690 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vg89\" (UniqueName: \"kubernetes.io/projected/b79b4aad-40b3-45ae-a757-36638cbb4571-kube-api-access-8vg89\") pod \"downloads-7954f5f757-f7t79\" (UID: \"b79b4aad-40b3-45ae-a757-36638cbb4571\") " pod="openshift-console/downloads-7954f5f757-f7t79" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602709 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6s2p\" (UniqueName: \"kubernetes.io/projected/509bc0a3-3bfa-4d55-b42d-3f584823ba57-kube-api-access-m6s2p\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-image-import-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602772 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2x4r\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-kube-api-access-q2x4r\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602793 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnpft\" (UniqueName: 
\"kubernetes.io/projected/dd3bd945-b0c4-404c-a066-c7fd19e177f6-kube-api-access-hnpft\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602812 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602834 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-config\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602873 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602895 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p72v7\" (UniqueName: \"kubernetes.io/projected/871941dc-2eb3-4bb4-8db4-e1a726f1171e-kube-api-access-p72v7\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602910 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b9c8e3-352b-4295-b407-6121f10f878f-serving-cert\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.602933 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.603470 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.600071 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") 
" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.620450 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit-dir\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.621690 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/509bc0a3-3bfa-4d55-b42d-3f584823ba57-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.621804 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-service-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.623620 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.624338 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71c2245d-dcc3-4d90-abaf-381b5784bcc5-node-pullsecrets\") pod \"apiserver-76f77b778f-m4mzh\" (UID: 
\"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.624381 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/871941dc-2eb3-4bb4-8db4-e1a726f1171e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.625027 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f790193b-0058-41ab-8320-0819ec673fb9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.625135 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.625710 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-etcd-client\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.625825 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-serving-cert\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.626217 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-client\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.626435 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0808f45e-026f-464a-8e52-db360912b7a5-metrics-tls\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.626869 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-serving-cert\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.627333 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.627392 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f7t79"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.627416 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.627492 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: 
\"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.627808 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-serving-cert\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.628135 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.630352 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.632829 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.635138 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.635210 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.635989 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.636561 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.640010 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/144b5994-4ccc-47ba-9c33-4d94119f2a07-trusted-ca\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.640740 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.644923 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/04a64a02-6f03-45e0-8c06-47d5f452adc8-encryption-config\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.641010 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.646896 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.647465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-config\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.648035 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.649724 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.659992 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.660346 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.661546 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.662282 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-audit\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.662325 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/300dcadc-c269-45fb-b9b9-7ea4cca524b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: 
I0223 06:46:45.662786 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.663571 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664004 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b9c8e3-352b-4295-b407-6121f10f878f-config\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664036 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664306 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/04a64a02-6f03-45e0-8c06-47d5f452adc8-audit-dir\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664460 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664614 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664641 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664763 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/04a64a02-6f03-45e0-8c06-47d5f452adc8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.664766 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-config\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.666247 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.666392 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/871941dc-2eb3-4bb4-8db4-e1a726f1171e-serving-cert\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.666437 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r9t74"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.666582 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668049 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/144b5994-4ccc-47ba-9c33-4d94119f2a07-metrics-tls\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668098 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668532 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668829 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668837 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/509bc0a3-3bfa-4d55-b42d-3f584823ba57-images\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.668875 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gvkls"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.669148 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/71c2245d-dcc3-4d90-abaf-381b5784bcc5-encryption-config\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.670014 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-trusted-ca\") pod \"console-operator-58897d9998-sz8sg\" (UID: 
\"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.670324 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.670343 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/71c2245d-dcc3-4d90-abaf-381b5784bcc5-image-import-ca\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.671286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.671530 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/300dcadc-c269-45fb-b9b9-7ea4cca524b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.671575 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-44jsw"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.672436 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.672916 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.672950 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dd3bd945-b0c4-404c-a066-c7fd19e177f6-machine-approver-tls\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.672964 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd3bd945-b0c4-404c-a066-c7fd19e177f6-config\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.673416 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.674192 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lbv8k"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.674465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.675142 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.675576 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6b9c8e3-352b-4295-b407-6121f10f878f-serving-cert\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.676226 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.676905 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv"] Feb 23 
06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.677935 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.678950 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.681406 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x8c6x"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.682252 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.683531 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rck9k"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.684555 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.685816 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.686909 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.688120 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.690393 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.691588 5118 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sz8sg"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.692647 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.693220 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.693927 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.695175 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.696346 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-77kkb"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.696579 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.696639 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.696807 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.697291 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.701866 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.701896 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.701908 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.701918 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8wn"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.702792 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703690 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-969gb\" (UniqueName: \"kubernetes.io/projected/813a6df4-23c6-4969-a911-9f74fa603ccf-kube-api-access-969gb\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703749 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gnd\" (UniqueName: \"kubernetes.io/projected/2621c2c6-b102-4d43-8733-095ef4181f5f-kube-api-access-z6gnd\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 
06:46:45.703775 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813a6df4-23c6-4969-a911-9f74fa603ccf-proxy-tls\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703794 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vg89\" (UniqueName: \"kubernetes.io/projected/b79b4aad-40b3-45ae-a757-36638cbb4571-kube-api-access-8vg89\") pod \"downloads-7954f5f757-f7t79\" (UID: \"b79b4aad-40b3-45ae-a757-36638cbb4571\") " pod="openshift-console/downloads-7954f5f757-f7t79" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703851 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703883 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587284f2-ccb4-46bb-8851-aa4e346530de-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-profile-collector-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: 
\"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703918 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-srv-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703937 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b57717-c607-4420-8944-8aadad02680e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.703989 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-images\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.704015 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21b57717-c607-4420-8944-8aadad02680e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.704038 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/587284f2-ccb4-46bb-8851-aa4e346530de-config\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.704066 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b57717-c607-4420-8944-8aadad02680e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.704092 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587284f2-ccb4-46bb-8851-aa4e346530de-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.705078 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-548rl"] Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.705208 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.706446 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"] 
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.707609 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.709580 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kf2x5"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.709684 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587284f2-ccb4-46bb-8851-aa4e346530de-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.710609 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x8c6x"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.710686 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.711291 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.712403 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kf2x5"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.713459 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.719750 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lbv8k"]
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.733396 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.736105 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587284f2-ccb4-46bb-8851-aa4e346530de-config\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.753332 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.772556 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.793273 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.812854 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.833135 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.855049 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.873697 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.893326 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.901698 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21b57717-c607-4420-8944-8aadad02680e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.913308 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.915662 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b57717-c607-4420-8944-8aadad02680e-config\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.932936 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.953684 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.973465 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.993492 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 23 06:46:45 crc kubenswrapper[5118]: I0223 06:46:45.996654 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/813a6df4-23c6-4969-a911-9f74fa603ccf-images\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.014045 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.033608 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.053194 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.072831 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.093925 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.113252 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.134430 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.153527 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.157228 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/813a6df4-23c6-4969-a911-9f74fa603ccf-proxy-tls\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.174157 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.193831 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.213618 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.232863 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.253352 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.272957 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.294152 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.313836 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.332908 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.354894 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.359368 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-profile-collector-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.373955 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.378900 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2621c2c6-b102-4d43-8733-095ef4181f5f-srv-cert\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.413335 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.433843 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.453400 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.473673 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.493050 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.512139 5118 request.go:700] Waited for 1.004079128s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-metrics-certs-default&limit=500&resourceVersion=0
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.513767 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.535135 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.554704 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.574427 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.592566 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.613711 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.634551 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.653874 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.673847 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.694573 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.713327 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.733740 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.753715 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.774663 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.793976 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.813132 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.834160 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.853947 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.874360 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.894659 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.913518 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.953583 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 23 06:46:46 crc kubenswrapper[5118]: I0223 06:46:46.973790 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.001093 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.012638 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.033727 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.054197 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.073765 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.093744 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.114506 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.133508 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.154024 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.205309 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8n4c\" (UniqueName: \"kubernetes.io/projected/f790193b-0058-41ab-8320-0819ec673fb9-kube-api-access-z8n4c\") pod \"cluster-samples-operator-665b6dd947-gkztb\" (UID: \"f790193b-0058-41ab-8320-0819ec673fb9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.218535 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x47f\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-kube-api-access-2x47f\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.240028 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xpcs\" (UniqueName: \"kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs\") pod \"console-f9d7485db-r9t74\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.263391 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/144b5994-4ccc-47ba-9c33-4d94119f2a07-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7hlsm\" (UID: \"144b5994-4ccc-47ba-9c33-4d94119f2a07\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.271762 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfpl\" (UniqueName: \"kubernetes.io/projected/71c2245d-dcc3-4d90-abaf-381b5784bcc5-kube-api-access-fwfpl\") pod \"apiserver-76f77b778f-m4mzh\" (UID: \"71c2245d-dcc3-4d90-abaf-381b5784bcc5\") " pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.289946 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqgc9\" (UniqueName: \"kubernetes.io/projected/04a64a02-6f03-45e0-8c06-47d5f452adc8-kube-api-access-bqgc9\") pod \"apiserver-7bbb656c7d-4jd4v\" (UID: \"04a64a02-6f03-45e0-8c06-47d5f452adc8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.310415 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.311445 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6ms\" (UniqueName: \"kubernetes.io/projected/0808f45e-026f-464a-8e52-db360912b7a5-kube-api-access-qh6ms\") pod \"dns-operator-744455d44c-gvkls\" (UID: \"0808f45e-026f-464a-8e52-db360912b7a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.329554 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.350489 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kf4\" (UniqueName: \"kubernetes.io/projected/f3a278dd-90ba-4af2-860d-ec7350b7e8f9-kube-api-access-s5kf4\") pod \"console-operator-58897d9998-sz8sg\" (UID: \"f3a278dd-90ba-4af2-860d-ec7350b7e8f9\") " pod="openshift-console-operator/console-operator-58897d9998-sz8sg"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.373008 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnpft\" (UniqueName: \"kubernetes.io/projected/dd3bd945-b0c4-404c-a066-c7fd19e177f6-kube-api-access-hnpft\") pod \"machine-approver-56656f9798-srlf5\" (UID: \"dd3bd945-b0c4-404c-a066-c7fd19e177f6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.392991 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqc4v\" (UniqueName: \"kubernetes.io/projected/e6b9c8e3-352b-4295-b407-6121f10f878f-kube-api-access-wqc4v\") pod \"authentication-operator-69f744f599-bqs2t\" (UID: \"e6b9c8e3-352b-4295-b407-6121f10f878f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.416934 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjkxx\" (UniqueName: \"kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx\") pod \"oauth-openshift-558db77b4-h6hnk\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.437296 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9zl\" (UniqueName: \"kubernetes.io/projected/300dcadc-c269-45fb-b9b9-7ea4cca524b5-kube-api-access-cl9zl\") pod \"openshift-apiserver-operator-796bbdcf4f-4h2kq\" (UID: \"300dcadc-c269-45fb-b9b9-7ea4cca524b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.453489 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2x4r\" (UniqueName: \"kubernetes.io/projected/8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b-kube-api-access-q2x4r\") pod \"cluster-image-registry-operator-dc59b4c8b-mr9r9\" (UID: \"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.470066 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.475325 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p72v7\" (UniqueName: \"kubernetes.io/projected/871941dc-2eb3-4bb4-8db4-e1a726f1171e-kube-api-access-p72v7\") pod \"openshift-config-operator-7777fb866f-6kgbs\" (UID: \"871941dc-2eb3-4bb4-8db4-e1a726f1171e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.486394 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.486545 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.486719 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.491267 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.494750 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zphm\" (UniqueName: \"kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm\") pod \"controller-manager-879f6c89f-lmbrq\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.495908 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r9t74"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.509670 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6s2p\" (UniqueName: \"kubernetes.io/projected/509bc0a3-3bfa-4d55-b42d-3f584823ba57-kube-api-access-m6s2p\") pod \"machine-api-operator-5694c8668f-wtljz\" (UID: \"509bc0a3-3bfa-4d55-b42d-3f584823ba57\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.511586 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sz8sg"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.512984 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.516735 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.532319 5118 request.go:700] Waited for 1.858607732s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.534761 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.555726 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.558916 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.565935 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.576340 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.588601 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.592675 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"]
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.595836 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.615003 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.635216 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.659778 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.662974 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.676827 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.692441 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.693922 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 06:46:47 crc kubenswrapper[5118]: W0223 06:46:47.696967 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a64a02_6f03_45e0_8c06_47d5f452adc8.slice/crio-91617e9c23879deafd7c608ffb1894ac16b7127215219d5c81698c6bb5b8ef1c WatchSource:0}: Error finding container 91617e9c23879deafd7c608ffb1894ac16b7127215219d5c81698c6bb5b8ef1c: Status 404 returned error can't find the container with id 91617e9c23879deafd7c608ffb1894ac16b7127215219d5c81698c6bb5b8ef1c
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.705803 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.715221 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.746041 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.754412 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.767449 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gvkls"]
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.768216 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r9t74"]
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.773960 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.794278 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.815704 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.854313 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/587284f2-ccb4-46bb-8851-aa4e346530de-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hlfkx\" (UID: \"587284f2-ccb4-46bb-8851-aa4e346530de\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.871875 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.875824 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-969gb\" (UniqueName: \"kubernetes.io/projected/813a6df4-23c6-4969-a911-9f74fa603ccf-kube-api-access-969gb\") pod \"machine-config-operator-74547568cd-n6fnv\" (UID: \"813a6df4-23c6-4969-a911-9f74fa603ccf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.894604 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gnd\" (UniqueName: \"kubernetes.io/projected/2621c2c6-b102-4d43-8733-095ef4181f5f-kube-api-access-z6gnd\") pod \"catalog-operator-68c6474976-82cw7\" (UID: \"2621c2c6-b102-4d43-8733-095ef4181f5f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.909944 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vg89\" (UniqueName: \"kubernetes.io/projected/b79b4aad-40b3-45ae-a757-36638cbb4571-kube-api-access-8vg89\") pod \"downloads-7954f5f757-f7t79\" (UID: \"b79b4aad-40b3-45ae-a757-36638cbb4571\") " pod="openshift-console/downloads-7954f5f757-f7t79"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.931730 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21b57717-c607-4420-8944-8aadad02680e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zrd7k\" (UID: \"21b57717-c607-4420-8944-8aadad02680e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.934988 5118 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.954604 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 23 06:46:47 crc kubenswrapper[5118]: I0223 06:46:47.973396 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052717 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e5a1bc-8eba-438c-9f19-98f25a962db8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052752 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-client\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052772 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052881 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980dea0-5e96-4761-a991-a2998c9f2684-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052916 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052956 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.052984 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwjt5\" (UniqueName: \"kubernetes.io/projected/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-kube-api-access-rwjt5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.054604 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-ca\") 
pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.055366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.055724 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.055764 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.056628 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.556599273 +0000 UTC m=+71.560383846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.058295 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z8z\" (UniqueName: \"kubernetes.io/projected/c8e5a1bc-8eba-438c-9f19-98f25a962db8-kube-api-access-44z8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.058351 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0980dea0-5e96-4761-a991-a2998c9f2684-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.058410 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.058794 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpd6\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.059234 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghxp\" (UniqueName: \"kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.059307 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060526 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060567 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets\") 
pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060620 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980dea0-5e96-4761-a991-a2998c9f2684-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060659 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-serving-cert\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060676 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj55t\" (UniqueName: \"kubernetes.io/projected/31d16fe7-a19b-493e-b69b-c619870aa747-kube-api-access-hj55t\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060730 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e5a1bc-8eba-438c-9f19-98f25a962db8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.060976 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-config\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.061004 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.061074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-service-ca\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.105576 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6hnk"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.122557 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f7t79" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.143231 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.143420 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.167722 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wtljz"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.174218 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.186202 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sz8sg"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.189138 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.189675 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.190133 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.191281 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.191527 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.193839 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-m4mzh"] Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.195669 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.69563426 +0000 UTC m=+71.699418833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.197507 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.199881 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-csi-data-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.199927 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5qff\" (UniqueName: \"kubernetes.io/projected/18908f3c-feca-437a-bde5-df76b32e9a10-kube-api-access-z5qff\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.199968 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fglg\" (UniqueName: \"kubernetes.io/projected/adfd1b6a-2add-429b-b8e9-b245e1b999ad-kube-api-access-7fglg\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.200023 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.200042 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwjt5\" (UniqueName: \"kubernetes.io/projected/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-kube-api-access-rwjt5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.200067 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-ca\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.200443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.200506 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.201047 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.201248 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-socket-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.201587 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.201676 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.701647104 +0000 UTC m=+71.705431677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.202246 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.202294 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b44d\" (UniqueName: \"kubernetes.io/projected/542d3480-a525-47a5-88b6-61189c31609e-kube-api-access-9b44d\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.202320 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d37a0834-d9d7-48d2-a69a-3f4982f227fc-proxy-tls\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.203504 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-ca\") pod 
\"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204130 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q67b6\" (UniqueName: \"kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204195 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z8z\" (UniqueName: \"kubernetes.io/projected/c8e5a1bc-8eba-438c-9f19-98f25a962db8-kube-api-access-44z8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204224 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0980dea0-5e96-4761-a991-a2998c9f2684-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204253 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 
06:46:48.204307 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-mountpoint-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204332 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pknn\" (UniqueName: \"kubernetes.io/projected/4dae455a-bb65-4035-9cac-679bcb07e7f3-kube-api-access-6pknn\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204417 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbtzf\" (UniqueName: \"kubernetes.io/projected/312dcf47-596f-497a-9dfd-2f782406b1f0-kube-api-access-zbtzf\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204445 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9ms4\" (UniqueName: \"kubernetes.io/projected/9b14be00-5577-4aaf-b7b2-762b27be7d7e-kube-api-access-t9ms4\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204509 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpd6\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6\") pod 
\"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.204572 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rck9k\" (UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.208027 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.208061 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5rxx\" (UniqueName: \"kubernetes.io/projected/af54156b-7666-4bb6-8747-f5b7450b3c51-kube-api-access-k5rxx\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.208095 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf8pt\" (UniqueName: \"kubernetes.io/projected/4a3c2076-5da9-4b1d-9485-37c3b578669f-kube-api-access-kf8pt\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.210236 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b14be00-5577-4aaf-b7b2-762b27be7d7e-serving-cert\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.210753 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211042 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-webhook-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211178 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghxp\" (UniqueName: \"kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211265 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4509298-a9d8-48a5-91b0-2c28a10e837b-cert\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " 
pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211341 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b14be00-5577-4aaf-b7b2-762b27be7d7e-config\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a3c2076-5da9-4b1d-9485-37c3b578669f-config-volume\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211515 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211586 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmw7\" (UniqueName: \"kubernetes.io/projected/968b751e-e05f-4b13-b627-a44b4db9777d-kube-api-access-ldmw7\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211691 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/18908f3c-feca-437a-bde5-df76b32e9a10-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211758 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/968b751e-e05f-4b13-b627-a44b4db9777d-tmpfs\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.211191 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.213011 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214055 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-srv-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214299 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214375 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-plugins-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214470 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214556 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d37a0834-d9d7-48d2-a69a-3f4982f227fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214655 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" 
Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980dea0-5e96-4761-a991-a2998c9f2684-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214768 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-serving-cert\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.214924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj55t\" (UniqueName: \"kubernetes.io/projected/31d16fe7-a19b-493e-b69b-c619870aa747-kube-api-access-hj55t\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.215016 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3d6cc75-6216-4db9-8f11-6b5045548df1-control-plane-machine-set-operator-tls\") 
pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.215101 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-certs\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.215285 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-metrics-certs\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.215330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.215435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e5a1bc-8eba-438c-9f19-98f25a962db8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.216293 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pk49\" (UniqueName: \"kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.216603 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.216822 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.216900 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f2g\" (UniqueName: \"kubernetes.io/projected/d37a0834-d9d7-48d2-a69a-3f4982f227fc-kube-api-access-j4f2g\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217258 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-config\") pod \"etcd-operator-b45778765-9kjcn\" (UID: 
\"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217337 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217450 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tb2l\" (UniqueName: \"kubernetes.io/projected/e4509298-a9d8-48a5-91b0-2c28a10e837b-kube-api-access-2tb2l\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217502 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/312dcf47-596f-497a-9dfd-2f782406b1f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217579 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wdk5\" (UniqueName: \"kubernetes.io/projected/7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2-kube-api-access-7wdk5\") pod \"migrator-59844c95c7-qq9bd\" (UID: \"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217643 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217698 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-service-ca\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217736 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adfd1b6a-2add-429b-b8e9-b245e1b999ad-service-ca-bundle\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217766 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217791 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18908f3c-feca-437a-bde5-df76b32e9a10-signing-key\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217816 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-client\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217840 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217867 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e5a1bc-8eba-438c-9f19-98f25a962db8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217905 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-node-bootstrap-token\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217917 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-config\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217939 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.217959 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-default-certificate\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.218097 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-registration-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.218546 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-stats-auth\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.218579 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a3c2076-5da9-4b1d-9485-37c3b578669f-metrics-tls\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.218599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tft\" (UniqueName: \"kubernetes.io/projected/d3d6cc75-6216-4db9-8f11-6b5045548df1-kube-api-access-f4tft\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.219411 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.219675 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980dea0-5e96-4761-a991-a2998c9f2684-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.219927 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cr9\" (UniqueName: \"kubernetes.io/projected/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-kube-api-access-25cr9\") pod \"multus-admission-controller-857f4d67dd-rck9k\" 
(UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.219941 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.220341 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8e5a1bc-8eba-438c-9f19-98f25a962db8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.220468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.220566 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnjxj\" (UniqueName: \"kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.223489 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" event={"ID":"dd3bd945-b0c4-404c-a066-c7fd19e177f6","Type":"ContainerStarted","Data":"31200b3a575b3e16ecc423fc400285a6901fd46453084cb90b6ab86bdf99b365"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.223607 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" event={"ID":"dd3bd945-b0c4-404c-a066-c7fd19e177f6","Type":"ContainerStarted","Data":"a388d12c737c580637b3a8be1011c4012421b467a52368710f5e68cb4719e840"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.226180 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.226192 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0980dea0-5e96-4761-a991-a2998c9f2684-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.226407 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8e5a1bc-8eba-438c-9f19-98f25a962db8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.228358 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.229865 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0980dea0-5e96-4761-a991-a2998c9f2684-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.230198 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.234227 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-serving-cert\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.235274 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" event={"ID":"509bc0a3-3bfa-4d55-b42d-3f584823ba57","Type":"ContainerStarted","Data":"22f0ecda5e641b703e788fb2ed4bb196e96597b1deaf71850dd73d7e49b10f4e"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.237168 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" event={"ID":"300dcadc-c269-45fb-b9b9-7ea4cca524b5","Type":"ContainerStarted","Data":"998537fd560a85cced8ecde258ca9ba491f3fd386cd5c984ee0a8deba4299ddc"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.240759 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.241193 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" event={"ID":"90b8cdf6-2770-4311-87a9-55c70e7967cf","Type":"ContainerStarted","Data":"cb4381f00dd44a5cc9866074f305f826582f23254003faf098d5c97662d535b6"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.244347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls" event={"ID":"0808f45e-026f-464a-8e52-db360912b7a5","Type":"ContainerStarted","Data":"23e34d61317431300a2dce62927d9303e9db303a19c8e324036b2a08508de68e"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.247716 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" event={"ID":"f3a278dd-90ba-4af2-860d-ec7350b7e8f9","Type":"ContainerStarted","Data":"82fb64ac9f43888e1561cdf461f98458e741e156d0180de2438d98647ecd3cca"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.249401 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-service-ca\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.250658 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-bqs2t"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.251288 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r9t74" event={"ID":"d8fc5dad-079f-4768-a10e-616ff7228ccd","Type":"ContainerStarted","Data":"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.251327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r9t74" event={"ID":"d8fc5dad-079f-4768-a10e-616ff7228ccd","Type":"ContainerStarted","Data":"1ce2ccfffb303f7b4c7a8e17ff25ef727daf4707a8d3437b778c1b3065776c76"} Feb 23 06:46:48 crc kubenswrapper[5118]: W0223 06:46:48.251983 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587284f2_ccb4_46bb_8851_aa4e346530de.slice/crio-30ef81715dd54e4f2708c7321d7e431eb2816518edc6df35894f5821edd4196f WatchSource:0}: Error finding container 30ef81715dd54e4f2708c7321d7e431eb2816518edc6df35894f5821edd4196f: Status 404 returned error can't find the container with id 30ef81715dd54e4f2708c7321d7e431eb2816518edc6df35894f5821edd4196f Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.252979 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwjt5\" (UniqueName: \"kubernetes.io/projected/bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522-kube-api-access-rwjt5\") pod \"kube-storage-version-migrator-operator-b67b599dd-9kwvf\" (UID: \"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.253126 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 
06:46:48.254698 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31d16fe7-a19b-493e-b69b-c619870aa747-etcd-client\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.254991 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9"] Feb 23 06:46:48 crc kubenswrapper[5118]: W0223 06:46:48.259330 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6b9c8e3_352b_4295_b407_6121f10f878f.slice/crio-5f68d9e0ae503abee2cebe3c4dc3804fe897c4ecd8293ad4bc67e6645d3004ae WatchSource:0}: Error finding container 5f68d9e0ae503abee2cebe3c4dc3804fe897c4ecd8293ad4bc67e6645d3004ae: Status 404 returned error can't find the container with id 5f68d9e0ae503abee2cebe3c4dc3804fe897c4ecd8293ad4bc67e6645d3004ae Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.263796 5118 generic.go:334] "Generic (PLEG): container finished" podID="04a64a02-6f03-45e0-8c06-47d5f452adc8" containerID="667c17e3326c41c54ab4c67d71d0b2555ad903ab32a0f6fef1253a732bfb60f8" exitCode=0 Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.263854 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" event={"ID":"04a64a02-6f03-45e0-8c06-47d5f452adc8","Type":"ContainerDied","Data":"667c17e3326c41c54ab4c67d71d0b2555ad903ab32a0f6fef1253a732bfb60f8"} Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.263889 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" event={"ID":"04a64a02-6f03-45e0-8c06-47d5f452adc8","Type":"ContainerStarted","Data":"91617e9c23879deafd7c608ffb1894ac16b7127215219d5c81698c6bb5b8ef1c"} Feb 23 
06:46:48 crc kubenswrapper[5118]: W0223 06:46:48.266409 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfff42d_03d2_4cd9_9c0a_beb0b9d85b5b.slice/crio-1764ca9dea5918bb905c681831dc0460fc0e8b794901c8fcdae0b1151d2eb65c WatchSource:0}: Error finding container 1764ca9dea5918bb905c681831dc0460fc0e8b794901c8fcdae0b1151d2eb65c: Status 404 returned error can't find the container with id 1764ca9dea5918bb905c681831dc0460fc0e8b794901c8fcdae0b1151d2eb65c Feb 23 06:46:48 crc kubenswrapper[5118]: W0223 06:46:48.267292 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871941dc_2eb3_4bb4_8db4_e1a726f1171e.slice/crio-0dcbbbb791c7b1b5815d1914d8aaddf88a085362bce23149134c4b756f1fb837 WatchSource:0}: Error finding container 0dcbbbb791c7b1b5815d1914d8aaddf88a085362bce23149134c4b756f1fb837: Status 404 returned error can't find the container with id 0dcbbbb791c7b1b5815d1914d8aaddf88a085362bce23149134c4b756f1fb837 Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.269009 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z8z\" (UniqueName: \"kubernetes.io/projected/c8e5a1bc-8eba-438c-9f19-98f25a962db8-kube-api-access-44z8z\") pod \"openshift-controller-manager-operator-756b6f6bc6-ngh7g\" (UID: \"c8e5a1bc-8eba-438c-9f19-98f25a962db8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.299597 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpd6\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc 
kubenswrapper[5118]: I0223 06:46:48.321438 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321767 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wdk5\" (UniqueName: \"kubernetes.io/projected/7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2-kube-api-access-7wdk5\") pod \"migrator-59844c95c7-qq9bd\" (UID: \"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321794 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tb2l\" (UniqueName: \"kubernetes.io/projected/e4509298-a9d8-48a5-91b0-2c28a10e837b-kube-api-access-2tb2l\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321817 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/312dcf47-596f-497a-9dfd-2f782406b1f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321841 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: 
\"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321864 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adfd1b6a-2add-429b-b8e9-b245e1b999ad-service-ca-bundle\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321898 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321919 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18908f3c-feca-437a-bde5-df76b32e9a10-signing-key\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321949 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-node-bootstrap-token\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.321994 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322015 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-default-certificate\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322047 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-registration-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322073 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-stats-auth\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322122 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a3c2076-5da9-4b1d-9485-37c3b578669f-metrics-tls\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322143 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tft\" (UniqueName: 
\"kubernetes.io/projected/d3d6cc75-6216-4db9-8f11-6b5045548df1-kube-api-access-f4tft\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322164 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cr9\" (UniqueName: \"kubernetes.io/projected/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-kube-api-access-25cr9\") pod \"multus-admission-controller-857f4d67dd-rck9k\" (UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322198 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnjxj\" (UniqueName: \"kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322217 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322237 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-csi-data-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322257 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5qff\" (UniqueName: \"kubernetes.io/projected/18908f3c-feca-437a-bde5-df76b32e9a10-kube-api-access-z5qff\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322278 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fglg\" (UniqueName: \"kubernetes.io/projected/adfd1b6a-2add-429b-b8e9-b245e1b999ad-kube-api-access-7fglg\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322304 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322342 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-socket-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322369 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 
06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322430 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q67b6\" (UniqueName: \"kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322448 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b44d\" (UniqueName: \"kubernetes.io/projected/542d3480-a525-47a5-88b6-61189c31609e-kube-api-access-9b44d\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322467 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d37a0834-d9d7-48d2-a69a-3f4982f227fc-proxy-tls\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322500 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-mountpoint-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322534 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pknn\" (UniqueName: \"kubernetes.io/projected/4dae455a-bb65-4035-9cac-679bcb07e7f3-kube-api-access-6pknn\") pod \"csi-hostpathplugin-kf2x5\" (UID: 
\"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322554 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbtzf\" (UniqueName: \"kubernetes.io/projected/312dcf47-596f-497a-9dfd-2f782406b1f0-kube-api-access-zbtzf\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322572 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9ms4\" (UniqueName: \"kubernetes.io/projected/9b14be00-5577-4aaf-b7b2-762b27be7d7e-kube-api-access-t9ms4\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322605 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf8pt\" (UniqueName: \"kubernetes.io/projected/4a3c2076-5da9-4b1d-9485-37c3b578669f-kube-api-access-kf8pt\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322624 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rck9k\" (UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322648 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322675 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5rxx\" (UniqueName: \"kubernetes.io/projected/af54156b-7666-4bb6-8747-f5b7450b3c51-kube-api-access-k5rxx\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322699 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b14be00-5577-4aaf-b7b2-762b27be7d7e-serving-cert\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322733 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-webhook-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322771 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4509298-a9d8-48a5-91b0-2c28a10e837b-cert\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322790 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b14be00-5577-4aaf-b7b2-762b27be7d7e-config\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322807 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a3c2076-5da9-4b1d-9485-37c3b578669f-config-volume\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322828 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmw7\" (UniqueName: \"kubernetes.io/projected/968b751e-e05f-4b13-b627-a44b4db9777d-kube-api-access-ldmw7\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322867 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18908f3c-feca-437a-bde5-df76b32e9a10-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322885 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/968b751e-e05f-4b13-b627-a44b4db9777d-tmpfs\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322889 5118 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-srv-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.322996 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-plugins-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323051 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d37a0834-d9d7-48d2-a69a-3f4982f227fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323084 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323180 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3d6cc75-6216-4db9-8f11-6b5045548df1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323202 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-certs\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323243 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-metrics-certs\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323266 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f2g\" (UniqueName: \"kubernetes.io/projected/d37a0834-d9d7-48d2-a69a-3f4982f227fc-kube-api-access-j4f2g\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pk49\" (UniqueName: \"kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49\") pod \"collect-profiles-29530485-fmxd5\" (UID: 
\"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323304 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.323524 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.323627 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.82360886 +0000 UTC m=+71.827393433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.325704 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d37a0834-d9d7-48d2-a69a-3f4982f227fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.325909 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-plugins-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.326133 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.326727 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-mountpoint-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.327572 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adfd1b6a-2add-429b-b8e9-b245e1b999ad-service-ca-bundle\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.329251 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/18908f3c-feca-437a-bde5-df76b32e9a10-signing-cabundle\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.329683 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/968b751e-e05f-4b13-b627-a44b4db9777d-tmpfs\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.330237 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.330363 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-registration-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.332139 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-node-bootstrap-token\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.332207 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/542d3480-a525-47a5-88b6-61189c31609e-certs\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.332415 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b14be00-5577-4aaf-b7b2-762b27be7d7e-config\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.332753 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-socket-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.333298 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4509298-a9d8-48a5-91b0-2c28a10e837b-cert\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc 
kubenswrapper[5118]: I0223 06:46:48.333500 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4dae455a-bb65-4035-9cac-679bcb07e7f3-csi-data-dir\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.335429 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3d6cc75-6216-4db9-8f11-6b5045548df1-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.335984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a3c2076-5da9-4b1d-9485-37c3b578669f-config-volume\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.336406 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.344661 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d37a0834-d9d7-48d2-a69a-3f4982f227fc-proxy-tls\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.345776 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.345933 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.346051 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-stats-auth\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.346609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-default-certificate\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.346680 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rck9k\" 
(UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.346748 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.347628 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-webhook-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.348346 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b14be00-5577-4aaf-b7b2-762b27be7d7e-serving-cert\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.348425 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a3c2076-5da9-4b1d-9485-37c3b578669f-metrics-tls\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.349540 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.350013 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/968b751e-e05f-4b13-b627-a44b4db9777d-apiservice-cert\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.350601 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/18908f3c-feca-437a-bde5-df76b32e9a10-signing-key\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.350592 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/af54156b-7666-4bb6-8747-f5b7450b3c51-srv-cert\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.350630 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0980dea0-5e96-4761-a991-a2998c9f2684-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hvzgz\" (UID: \"0980dea0-5e96-4761-a991-a2998c9f2684\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.355738 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/312dcf47-596f-497a-9dfd-2f782406b1f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.362159 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adfd1b6a-2add-429b-b8e9-b245e1b999ad-metrics-certs\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.368259 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj55t\" (UniqueName: \"kubernetes.io/projected/31d16fe7-a19b-493e-b69b-c619870aa747-kube-api-access-hj55t\") pod \"etcd-operator-b45778765-9kjcn\" (UID: \"31d16fe7-a19b-493e-b69b-c619870aa747\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.368577 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghxp\" (UniqueName: \"kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp\") pod \"route-controller-manager-6576b87f9c-lmzgw\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.412257 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wdk5\" (UniqueName: \"kubernetes.io/projected/7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2-kube-api-access-7wdk5\") pod \"migrator-59844c95c7-qq9bd\" (UID: \"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 
06:46:48.412487 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.426269 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.426798 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:48.926779339 +0000 UTC m=+71.930563912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.431164 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.437277 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.444804 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tb2l\" (UniqueName: \"kubernetes.io/projected/e4509298-a9d8-48a5-91b0-2c28a10e837b-kube-api-access-2tb2l\") pod \"ingress-canary-lbv8k\" (UID: \"e4509298-a9d8-48a5-91b0-2c28a10e837b\") " pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.452474 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f2g\" (UniqueName: \"kubernetes.io/projected/d37a0834-d9d7-48d2-a69a-3f4982f227fc-kube-api-access-j4f2g\") pod \"machine-config-controller-84d6567774-548rl\" (UID: \"d37a0834-d9d7-48d2-a69a-3f4982f227fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.472462 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.474138 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.487942 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pk49\" (UniqueName: \"kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49\") pod \"collect-profiles-29530485-fmxd5\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.491611 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fglg\" (UniqueName: \"kubernetes.io/projected/adfd1b6a-2add-429b-b8e9-b245e1b999ad-kube-api-access-7fglg\") pod \"router-default-5444994796-sqp4p\" (UID: \"adfd1b6a-2add-429b-b8e9-b245e1b999ad\") " pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.494728 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.513857 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.516961 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q67b6\" (UniqueName: \"kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6\") pod \"marketplace-operator-79b997595-f42lm\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.524511 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.527640 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.528198 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.028175617 +0000 UTC m=+72.031960190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.528295 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.534075 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tft\" (UniqueName: \"kubernetes.io/projected/d3d6cc75-6216-4db9-8f11-6b5045548df1-kube-api-access-f4tft\") pod \"control-plane-machine-set-operator-78cbb6b69f-2rz7k\" (UID: \"d3d6cc75-6216-4db9-8f11-6b5045548df1\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.543560 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.551602 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.557775 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5rxx\" (UniqueName: \"kubernetes.io/projected/af54156b-7666-4bb6-8747-f5b7450b3c51-kube-api-access-k5rxx\") pod \"olm-operator-6b444d44fb-mmd6b\" (UID: \"af54156b-7666-4bb6-8747-f5b7450b3c51\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.558177 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.568243 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b44d\" (UniqueName: \"kubernetes.io/projected/542d3480-a525-47a5-88b6-61189c31609e-kube-api-access-9b44d\") pod \"machine-config-server-44jsw\" (UID: \"542d3480-a525-47a5-88b6-61189c31609e\") " pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.581701 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-44jsw" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.588745 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lbv8k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.595430 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.596123 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf8pt\" (UniqueName: \"kubernetes.io/projected/4a3c2076-5da9-4b1d-9485-37c3b578669f-kube-api-access-kf8pt\") pod \"dns-default-x8c6x\" (UID: \"4a3c2076-5da9-4b1d-9485-37c3b578669f\") " pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.618181 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbtzf\" (UniqueName: \"kubernetes.io/projected/312dcf47-596f-497a-9dfd-2f782406b1f0-kube-api-access-zbtzf\") pod \"package-server-manager-789f6589d5-gc8gv\" (UID: \"312dcf47-596f-497a-9dfd-2f782406b1f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.629056 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.629524 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.129499825 +0000 UTC m=+72.133284398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.641492 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmw7\" (UniqueName: \"kubernetes.io/projected/968b751e-e05f-4b13-b627-a44b4db9777d-kube-api-access-ldmw7\") pod \"packageserver-d55dfcdfc-kfxlc\" (UID: \"968b751e-e05f-4b13-b627-a44b4db9777d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.687735 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9ms4\" (UniqueName: \"kubernetes.io/projected/9b14be00-5577-4aaf-b7b2-762b27be7d7e-kube-api-access-t9ms4\") pod \"service-ca-operator-777779d784-77kkb\" (UID: \"9b14be00-5577-4aaf-b7b2-762b27be7d7e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.694397 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5qff\" (UniqueName: \"kubernetes.io/projected/18908f3c-feca-437a-bde5-df76b32e9a10-kube-api-access-z5qff\") pod \"service-ca-9c57cc56f-5v8wn\" (UID: \"18908f3c-feca-437a-bde5-df76b32e9a10\") " pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.696872 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cr9\" (UniqueName: \"kubernetes.io/projected/edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449-kube-api-access-25cr9\") pod 
\"multus-admission-controller-857f4d67dd-rck9k\" (UID: \"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.707634 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnjxj\" (UniqueName: \"kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj\") pod \"cni-sysctl-allowlist-ds-7zksp\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.712810 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f7t79"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.733160 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.733801 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.233758437 +0000 UTC m=+72.237543010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.734351 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.735598 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.235580518 +0000 UTC m=+72.239365091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.735798 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pknn\" (UniqueName: \"kubernetes.io/projected/4dae455a-bb65-4035-9cac-679bcb07e7f3-kube-api-access-6pknn\") pod \"csi-hostpathplugin-kf2x5\" (UID: \"4dae455a-bb65-4035-9cac-679bcb07e7f3\") " pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.788894 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.801023 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.807378 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.824053 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.836685 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.836850 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.837060 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.337036607 +0000 UTC m=+72.340821180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.851485 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.868085 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.874348 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.898766 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.937986 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:48 crc kubenswrapper[5118]: E0223 06:46:48.938514 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.438495498 +0000 UTC m=+72.442280061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.939044 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.947927 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"] Feb 23 06:46:48 crc kubenswrapper[5118]: I0223 06:46:48.991309 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz"] Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.042214 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.042371 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.542337411 +0000 UTC m=+72.546121974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.042820 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.042861 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.043276 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.543257941 +0000 UTC m=+72.547042694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.074435 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57709b5a-89ef-4120-af9b-d96006564eff-metrics-certs\") pod \"network-metrics-daemon-9qwhq\" (UID: \"57709b5a-89ef-4120-af9b-d96006564eff\") " pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:49 crc kubenswrapper[5118]: W0223 06:46:49.113952 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542d3480_a525_47a5_88b6_61189c31609e.slice/crio-90ef863a82a072d1417427947ea534610992711ae07b71acf63bb1580f26c57b WatchSource:0}: Error finding container 90ef863a82a072d1417427947ea534610992711ae07b71acf63bb1580f26c57b: Status 404 returned error can't find the container with id 90ef863a82a072d1417427947ea534610992711ae07b71acf63bb1580f26c57b Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.144704 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.145553 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.645529629 +0000 UTC m=+72.649314192 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.209046 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9qwhq" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.252952 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.253474 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.753461103 +0000 UTC m=+72.757245676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.354644 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.355881 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.855836475 +0000 UTC m=+72.859621048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.357029 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.357518 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.857500272 +0000 UTC m=+72.861284845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.358591 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-44jsw" event={"ID":"542d3480-a525-47a5-88b6-61189c31609e","Type":"ContainerStarted","Data":"90ef863a82a072d1417427947ea534610992711ae07b71acf63bb1580f26c57b"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.378892 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" event={"ID":"509bc0a3-3bfa-4d55-b42d-3f584823ba57","Type":"ContainerStarted","Data":"266724042e1fd5427121ff7d47b4f2339c34555eebba8b42b105ae5042c29c42"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.378948 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" event={"ID":"509bc0a3-3bfa-4d55-b42d-3f584823ba57","Type":"ContainerStarted","Data":"57894b30d4009bce8d03ac91a023e50412112a0212352f9d8bf732ee17317022"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.412613 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" event={"ID":"300dcadc-c269-45fb-b9b9-7ea4cca524b5","Type":"ContainerStarted","Data":"535d330e15172dd3061d25581029532f8de8cf95f14a68fe99906bba88c8fba6"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.416533 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" 
event={"ID":"71c2245d-dcc3-4d90-abaf-381b5784bcc5","Type":"ContainerStarted","Data":"ecd34fccf7548e6c14854a37ecd7e6188561afeee977e8ff60e9251213c122b5"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.416601 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" event={"ID":"71c2245d-dcc3-4d90-abaf-381b5784bcc5","Type":"ContainerStarted","Data":"b6d88b86599d6e052b8ddff7bbfaad4a495a10a429b319a76fe1405468b5e9c1"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.424243 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls" event={"ID":"0808f45e-026f-464a-8e52-db360912b7a5","Type":"ContainerStarted","Data":"b10eb6d994ca2254feac7a9e99c8a5e27f46f6cb38243275aa574a1ffa8e7c39"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.424297 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls" event={"ID":"0808f45e-026f-464a-8e52-db360912b7a5","Type":"ContainerStarted","Data":"78f77932557f60b49d74e9ee6af95fc39df4661d4ae89786ecb19b1deea9f4ea"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.441944 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g"] Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.448423 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" event={"ID":"144b5994-4ccc-47ba-9c33-4d94119f2a07","Type":"ContainerStarted","Data":"0a06695d6f42035f097910a7cb0eef9886c6a5e58f8cadc28eb6a702c178df37"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.448468 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" 
event={"ID":"144b5994-4ccc-47ba-9c33-4d94119f2a07","Type":"ContainerStarted","Data":"57dbfe676cf7c5f98eac03dae57af543915decb33a46500b1b5fb504a64c4d69"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.457738 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.459665 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:49.959637837 +0000 UTC m=+72.963422420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.466411 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" event={"ID":"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118","Type":"ContainerStarted","Data":"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.466464 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" 
event={"ID":"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118","Type":"ContainerStarted","Data":"7ad77e712b97912ad69865416db69013ab487807ac581ca2d90a8e510e267828"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.466691 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.470216 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f7t79" event={"ID":"b79b4aad-40b3-45ae-a757-36638cbb4571","Type":"ContainerStarted","Data":"bfeaaa1c3e0e3a8127e88c0b7bcf61a26da13d657980f4d33430e3ea898ec0b5"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.473216 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" event={"ID":"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b","Type":"ContainerStarted","Data":"55553667f0fd98a07cb57b135b463c0083b449848cfe0f6340f5191ab18e0f17"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.473242 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" event={"ID":"8dfff42d-03d2-4cd9-9c0a-beb0b9d85b5b","Type":"ContainerStarted","Data":"1764ca9dea5918bb905c681831dc0460fc0e8b794901c8fcdae0b1151d2eb65c"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.475222 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" event={"ID":"21b57717-c607-4420-8944-8aadad02680e","Type":"ContainerStarted","Data":"0a84e2f96b11eab0269e4d2a9c5669934ed51e70042bc82f1e9d2959c0946a65"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.486915 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" 
event={"ID":"813a6df4-23c6-4969-a911-9f74fa603ccf","Type":"ContainerStarted","Data":"c4ee68004084af3f1f29afc652036001690389b86e3302b2f877e234f3295dbe"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.490274 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sqp4p" event={"ID":"adfd1b6a-2add-429b-b8e9-b245e1b999ad","Type":"ContainerStarted","Data":"9b1585c3648c8c5bd295fc1b05055072e661918313d9af0cf0d387d40abaebde"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.494992 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" event={"ID":"f790193b-0058-41ab-8320-0819ec673fb9","Type":"ContainerStarted","Data":"a3aef2002bb9e07bef5db2f051214626c00e0bfa4307b9e5a39be485323670fe"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.495057 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" event={"ID":"f790193b-0058-41ab-8320-0819ec673fb9","Type":"ContainerStarted","Data":"da8ea77c76611e8c819cb0e418e400c9d3987e4a9565732485826c5d416cac6e"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.498333 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" event={"ID":"0980dea0-5e96-4761-a991-a2998c9f2684","Type":"ContainerStarted","Data":"c48b38e1286ffb76d8b3a407c78f4ca7c97c0902e19c4298370231099601d3bb"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.500387 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" event={"ID":"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0","Type":"ContainerStarted","Data":"ba9d7fdfbf9dab12c2b1e9ad2a0fe52a354bf403d4943170c3120b2c034eb204"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.503686 5118 patch_prober.go:28] interesting 
pod/controller-manager-879f6c89f-lmbrq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.503732 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.512014 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" event={"ID":"2621c2c6-b102-4d43-8733-095ef4181f5f","Type":"ContainerStarted","Data":"0521d57ba7db8f38a1321e58aa704c53447a452b014879f4ae2399d2d7d857cb"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.512054 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" event={"ID":"2621c2c6-b102-4d43-8733-095ef4181f5f","Type":"ContainerStarted","Data":"9887f5203061c04c9d2ff3f854ff823f1c7009b2e748d852af0d50eb247eaec0"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.512443 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.518457 5118 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-82cw7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.518526 5118 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" podUID="2621c2c6-b102-4d43-8733-095ef4181f5f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.534045 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" event={"ID":"871941dc-2eb3-4bb4-8db4-e1a726f1171e","Type":"ContainerStarted","Data":"0438c8d407c21976b30d513af37be7b3ae4e1c99b6e7f54683c3c55f051877c3"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.534126 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" event={"ID":"871941dc-2eb3-4bb4-8db4-e1a726f1171e","Type":"ContainerStarted","Data":"0dcbbbb791c7b1b5815d1914d8aaddf88a085362bce23149134c4b756f1fb837"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.560728 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.563903 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.063884459 +0000 UTC m=+73.067669242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.564875 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" event={"ID":"dd3bd945-b0c4-404c-a066-c7fd19e177f6","Type":"ContainerStarted","Data":"90449ce475cf5c6f0f3029795f88117f889add86ad82ca73fc7d657ca18a71f4"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.590806 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" event={"ID":"e6b9c8e3-352b-4295-b407-6121f10f878f","Type":"ContainerStarted","Data":"c77590da1b671d5a5c6ea822187fe597beb4912ead8af1fdc053536be8fbc368"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.590981 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" event={"ID":"e6b9c8e3-352b-4295-b407-6121f10f878f","Type":"ContainerStarted","Data":"5f68d9e0ae503abee2cebe3c4dc3804fe897c4ecd8293ad4bc67e6645d3004ae"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.609204 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" event={"ID":"587284f2-ccb4-46bb-8851-aa4e346530de","Type":"ContainerStarted","Data":"881bdf1d0bd2a6b206175da76f56e3444e49e658022d80e836446e62a2e446fe"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.609250 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" event={"ID":"587284f2-ccb4-46bb-8851-aa4e346530de","Type":"ContainerStarted","Data":"30ef81715dd54e4f2708c7321d7e431eb2816518edc6df35894f5821edd4196f"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.645612 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" event={"ID":"90b8cdf6-2770-4311-87a9-55c70e7967cf","Type":"ContainerStarted","Data":"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.646201 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.648310 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" event={"ID":"f3a278dd-90ba-4af2-860d-ec7350b7e8f9","Type":"ContainerStarted","Data":"50a7aff213b26c6d101c9f38bb1f0c10b99c0e7628b25bb5b17c044662be28f9"} Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.649260 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.654270 5118 patch_prober.go:28] interesting pod/console-operator-58897d9998-sz8sg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.654331 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" podUID="f3a278dd-90ba-4af2-860d-ec7350b7e8f9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/readyz\": dial tcp 
10.217.0.32:8443: connect: connection refused" Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.655417 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"] Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.662408 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.664419 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.164391378 +0000 UTC m=+73.168175951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.690929 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" event={"ID":"04a64a02-6f03-45e0-8c06-47d5f452adc8","Type":"ContainerStarted","Data":"c870a10797eae4ac1288945cd07abba85dbdac2dca68bfb2e4e142becbe57c9e"} Feb 23 06:46:49 crc kubenswrapper[5118]: W0223 06:46:49.701675 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e5a1bc_8eba_438c_9f19_98f25a962db8.slice/crio-26656086c46cbd1a4c0ce89c29cabf52eec6f83a56dc518f7515766717ade4e4 WatchSource:0}: Error finding container 26656086c46cbd1a4c0ce89c29cabf52eec6f83a56dc518f7515766717ade4e4: Status 404 returned error can't find the container with id 26656086c46cbd1a4c0ce89c29cabf52eec6f83a56dc518f7515766717ade4e4 Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.785339 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.786869 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.286844445 +0000 UTC m=+73.290629018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.886932 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.887360 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.387336174 +0000 UTC m=+73.391120747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.989299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:49 crc kubenswrapper[5118]: E0223 06:46:49.989807 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.489782836 +0000 UTC m=+73.493567409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:49 crc kubenswrapper[5118]: I0223 06:46:49.998136 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-srlf5" podStartSLOduration=30.998086831 podStartE2EDuration="30.998086831s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:49.973145825 +0000 UTC m=+72.976930418" watchObservedRunningTime="2026-02-23 06:46:49.998086831 +0000 UTC m=+73.001871404" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.040671 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bqs2t" podStartSLOduration=31.040637989 podStartE2EDuration="31.040637989s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:49.999013702 +0000 UTC m=+73.002798275" watchObservedRunningTime="2026-02-23 06:46:50.040637989 +0000 UTC m=+73.044422562" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.071678 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" podStartSLOduration=31.07164966 podStartE2EDuration="31.07164966s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.071558378 +0000 UTC m=+73.075342951" watchObservedRunningTime="2026-02-23 06:46:50.07164966 +0000 UTC m=+73.075434233" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.091357 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.091706 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.591685306 +0000 UTC m=+73.595469879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.108885 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-r9t74" podStartSLOduration=31.108834627 podStartE2EDuration="31.108834627s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.107441486 +0000 UTC m=+73.111226059" watchObservedRunningTime="2026-02-23 06:46:50.108834627 +0000 UTC m=+73.112619220" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.193183 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.193767 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.693739819 +0000 UTC m=+73.697524392 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.205347 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wtljz" podStartSLOduration=30.205288016 podStartE2EDuration="30.205288016s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.20050937 +0000 UTC m=+73.204293963" watchObservedRunningTime="2026-02-23 06:46:50.205288016 +0000 UTC m=+73.209072579" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.294806 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.295027 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.794976464 +0000 UTC m=+73.798761037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.295192 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.295573 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.795559128 +0000 UTC m=+73.799343691 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.372094 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4h2kq" podStartSLOduration=31.372064791 podStartE2EDuration="31.372064791s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.3706488 +0000 UTC m=+73.374433373" watchObservedRunningTime="2026-02-23 06:46:50.372064791 +0000 UTC m=+73.375849364" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.373648 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mr9r9" podStartSLOduration=31.373640387000002 podStartE2EDuration="31.373640387s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.262247305 +0000 UTC m=+73.266031888" watchObservedRunningTime="2026-02-23 06:46:50.373640387 +0000 UTC m=+73.377424960" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.396579 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" podStartSLOduration=31.396543437 podStartE2EDuration="31.396543437s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.394976892 +0000 UTC m=+73.398761475" watchObservedRunningTime="2026-02-23 06:46:50.396543437 +0000 UTC m=+73.400328010" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.400856 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.401123 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.901086398 +0000 UTC m=+73.904870971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.402011 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.402664 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:50.902650263 +0000 UTC m=+73.906434836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.462442 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" podStartSLOduration=30.462423524 podStartE2EDuration="30.462423524s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.461570015 +0000 UTC m=+73.465354588" watchObservedRunningTime="2026-02-23 06:46:50.462423524 +0000 UTC m=+73.466208097" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.504149 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.504642 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.004602154 +0000 UTC m=+74.008386727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.505011 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.505429 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.005414282 +0000 UTC m=+74.009198855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.514310 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.530647 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.555454 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" podStartSLOduration=31.555430046 podStartE2EDuration="31.555430046s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.553622016 +0000 UTC m=+73.557406589" watchObservedRunningTime="2026-02-23 06:46:50.555430046 +0000 UTC m=+73.559214619" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.607447 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.611752 5118 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.110851131 +0000 UTC m=+74.114635714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.632859 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lbv8k"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.702409 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.708988 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.709461 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.209444257 +0000 UTC m=+74.213228820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.709505 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-548rl"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.716855 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.801835 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" event={"ID":"f790193b-0058-41ab-8320-0819ec673fb9","Type":"ContainerStarted","Data":"d962466eae69bf19e7f21fa4c2714f0ec3c9e97aef49f00ab0bcf27880a7f3a3"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.816730 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hlfkx" podStartSLOduration=31.816706906 podStartE2EDuration="31.816706906s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.733304948 +0000 UTC m=+73.737089521" watchObservedRunningTime="2026-02-23 06:46:50.816706906 +0000 UTC m=+73.820491479" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.818926 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.820326 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.820425 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.320391329 +0000 UTC m=+74.324175902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.832848 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" event={"ID":"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522","Type":"ContainerStarted","Data":"56772a9f8258e4fe001edce1462143da9b2cfe209177fc82f9307abdac97c4b8"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.837181 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7hlsm" event={"ID":"144b5994-4ccc-47ba-9c33-4d94119f2a07","Type":"ContainerStarted","Data":"c4de4c614589ea9c1b3289b60b26cdcd048a9a950e09fa648bd4f857497a4884"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 
06:46:50.890523 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerStarted","Data":"b280c9b5348694540fdc64fc0302c47165a5a62ec6924b8651e7def1c315bf3d"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.891015 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" podStartSLOduration=31.890991351 podStartE2EDuration="31.890991351s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:50.848881953 +0000 UTC m=+73.852666526" watchObservedRunningTime="2026-02-23 06:46:50.890991351 +0000 UTC m=+73.894775924" Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.893044 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" event={"ID":"c8e5a1bc-8eba-438c-9f19-98f25a962db8","Type":"ContainerStarted","Data":"a17cec5c690bb8dc7327c6111f1bb6271a9edbe21416f8b0bab33f235967edba"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.893068 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" event={"ID":"c8e5a1bc-8eba-438c-9f19-98f25a962db8","Type":"ContainerStarted","Data":"26656086c46cbd1a4c0ce89c29cabf52eec6f83a56dc518f7515766717ade4e4"} Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.910301 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9qwhq"] Feb 23 06:46:50 crc kubenswrapper[5118]: I0223 06:46:50.916707 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9kjcn"] Feb 23 06:46:50 crc 
kubenswrapper[5118]: I0223 06:46:50.922089 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:50 crc kubenswrapper[5118]: E0223 06:46:50.924664 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.424650461 +0000 UTC m=+74.428435034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.011200 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f7t79" event={"ID":"b79b4aad-40b3-45ae-a757-36638cbb4571","Type":"ContainerStarted","Data":"1fd7f7ccb8e744ff34f118b8de680d0391cb3687a26806929124bc9418ce7cfc"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.011860 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f7t79" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.022294 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv"] Feb 23 06:46:51 crc kubenswrapper[5118]: 
I0223 06:46:51.028192 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.029759 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.529742472 +0000 UTC m=+74.533527045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.033319 5118 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7t79 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.033401 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7t79" podUID="b79b4aad-40b3-45ae-a757-36638cbb4571" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.036460 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" event={"ID":"871941dc-2eb3-4bb4-8db4-e1a726f1171e","Type":"ContainerDied","Data":"0438c8d407c21976b30d513af37be7b3ae4e1c99b6e7f54683c3c55f051877c3"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.037233 5118 generic.go:334] "Generic (PLEG): container finished" podID="871941dc-2eb3-4bb4-8db4-e1a726f1171e" containerID="0438c8d407c21976b30d513af37be7b3ae4e1c99b6e7f54683c3c55f051877c3" exitCode=0 Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.060864 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" podStartSLOduration=31.060803413 podStartE2EDuration="31.060803413s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.01172676 +0000 UTC m=+74.015511333" watchObservedRunningTime="2026-02-23 06:46:51.060803413 +0000 UTC m=+74.064587986" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.079700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" event={"ID":"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0","Type":"ContainerStarted","Data":"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.081914 5118 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lmzgw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.081962 5118 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.080950 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.088006 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gvkls" podStartSLOduration=32.087940728 podStartE2EDuration="32.087940728s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.060280972 +0000 UTC m=+74.064065545" watchObservedRunningTime="2026-02-23 06:46:51.087940728 +0000 UTC m=+74.091725311" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.101772 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" event={"ID":"21b57717-c607-4420-8944-8aadad02680e","Type":"ContainerStarted","Data":"49c114675d3c3e653d2ff28066d9711f97257b25d918b07141966e7be9936732"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.115496 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" event={"ID":"813a6df4-23c6-4969-a911-9f74fa603ccf","Type":"ContainerStarted","Data":"e172f9f55bf6c191464d7eebdf3f512c21c9bb4c867ed55d7ee1876a1d7ed4fa"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.131759 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-44jsw" 
event={"ID":"542d3480-a525-47a5-88b6-61189c31609e","Type":"ContainerStarted","Data":"3d7be1371f53fa4e81cdf32a625a0f9a8b667fcb377ff2aaba304e4545fb21d0"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.131932 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.133763 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.633738078 +0000 UTC m=+74.637522651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.152226 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" event={"ID":"8ba477b1-5a26-44b2-800e-2eea288fef93","Type":"ContainerStarted","Data":"bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.152297 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" 
event={"ID":"8ba477b1-5a26-44b2-800e-2eea288fef93","Type":"ContainerStarted","Data":"5db6293c7f82c3262fac1b93413aa998d4f055b96c6d19516c9f5655c6af418a"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.153372 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.200029 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lbv8k" event={"ID":"e4509298-a9d8-48a5-91b0-2c28a10e837b","Type":"ContainerStarted","Data":"e374ca7b57e15385aacfe0ce85f0e3206fbf1238d583c8d254e6580d944292f4"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.232373 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.235032 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.239532 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.739504934 +0000 UTC m=+74.743289507 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.246046 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.272217 5118 generic.go:334] "Generic (PLEG): container finished" podID="71c2245d-dcc3-4d90-abaf-381b5784bcc5" containerID="ecd34fccf7548e6c14854a37ecd7e6188561afeee977e8ff60e9251213c122b5" exitCode=0 Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.271730 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rck9k"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.276589 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" event={"ID":"71c2245d-dcc3-4d90-abaf-381b5784bcc5","Type":"ContainerDied","Data":"ecd34fccf7548e6c14854a37ecd7e6188561afeee977e8ff60e9251213c122b5"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.276971 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" event={"ID":"71c2245d-dcc3-4d90-abaf-381b5784bcc5","Type":"ContainerStarted","Data":"2f919ea818edf34ca3483e1bd360e6284e50aa6bb63181faf7de4ff84963af81"} Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.300854 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sz8sg" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 
06:46:51.300939 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.301164 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-82cw7" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.323585 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.352309 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kf2x5"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.362978 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.395167 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:51.895129251 +0000 UTC m=+74.898913824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.509882 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x8c6x"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.513792 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.516352 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-77kkb"] Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.516956 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.016925484 +0000 UTC m=+75.020710057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.520558 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.522149 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.0221364 +0000 UTC m=+75.025920973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.541584 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.550192 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5v8wn"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.623852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.624320 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.124297746 +0000 UTC m=+75.128082319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.679323 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f7t79" podStartSLOduration=32.679297811 podStartE2EDuration="32.679297811s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.672757115 +0000 UTC m=+74.676541688" watchObservedRunningTime="2026-02-23 06:46:51.679297811 +0000 UTC m=+74.683082384" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.681472 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ngh7g" podStartSLOduration=32.68146486 podStartE2EDuration="32.68146486s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.618493017 +0000 UTC m=+74.622277590" watchObservedRunningTime="2026-02-23 06:46:51.68146486 +0000 UTC m=+74.685249433" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.728023 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: 
\"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.728516 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.228499777 +0000 UTC m=+75.232284350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.766417 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gkztb" podStartSLOduration=32.766397011 podStartE2EDuration="32.766397011s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.765329547 +0000 UTC m=+74.769114130" watchObservedRunningTime="2026-02-23 06:46:51.766397011 +0000 UTC m=+74.770181584" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.828502 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zrd7k" podStartSLOduration=32.828483845 podStartE2EDuration="32.828483845s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 06:46:51.825515958 +0000 UTC m=+74.829300531" watchObservedRunningTime="2026-02-23 06:46:51.828483845 +0000 UTC m=+74.832268418" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.829355 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.835154 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.835557 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.335539462 +0000 UTC m=+75.339324035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.860439 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.863878 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.864727 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"] Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.938142 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:51 crc kubenswrapper[5118]: E0223 06:46:51.938675 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.438655958 +0000 UTC m=+75.442440531 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.940180 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-44jsw" podStartSLOduration=6.940145841 podStartE2EDuration="6.940145841s" podCreationTimestamp="2026-02-23 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.926004996 +0000 UTC m=+74.929789569" watchObservedRunningTime="2026-02-23 06:46:51.940145841 +0000 UTC m=+74.943930414" Feb 23 06:46:51 crc kubenswrapper[5118]: I0223 06:46:51.980784 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" podStartSLOduration=6.980761626 podStartE2EDuration="6.980761626s" podCreationTimestamp="2026-02-23 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:51.976563323 +0000 UTC m=+74.980347896" watchObservedRunningTime="2026-02-23 06:46:51.980761626 +0000 UTC m=+74.984546199" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.012400 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.012383281 podStartE2EDuration="2.012383281s" podCreationTimestamp="2026-02-23 06:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.006652543 +0000 UTC m=+75.010437116" watchObservedRunningTime="2026-02-23 06:46:52.012383281 +0000 UTC m=+75.016167844" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.014890 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ldm4m"] Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.018294 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.020563 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.043282 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" podStartSLOduration=32.043248018 podStartE2EDuration="32.043248018s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.043191897 +0000 UTC m=+75.046976470" watchObservedRunningTime="2026-02-23 06:46:52.043248018 +0000 UTC m=+75.047032591" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.043830 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.076645 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrhc\" (UniqueName: 
\"kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.076731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.076769 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.077688 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.577668455 +0000 UTC m=+75.581453028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.195443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.195651 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.195848 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrhc\" (UniqueName: \"kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.195913 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities\") pod \"certified-operators-jfmkj\" (UID: 
\"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.195982 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.196023 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsr2n\" (UniqueName: \"kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.196156 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.196582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.196730 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.197089 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.697054325 +0000 UTC m=+75.700838888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.218051 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldm4m"] Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.232768 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.235940 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.262421 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrhc\" (UniqueName: \"kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc\") pod \"certified-operators-jfmkj\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.263842 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.307675 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308041 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308094 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308155 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308175 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsr2n\" (UniqueName: \"kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308196 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8jd\" (UniqueName: \"kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308236 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.308833 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.308923 5118 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.808907746 +0000 UTC m=+75.812692319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.309192 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.312538 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.312949 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.325010 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x8c6x" event={"ID":"4a3c2076-5da9-4b1d-9485-37c3b578669f","Type":"ContainerStarted","Data":"781c719623fa20b72e7f84b6410ba59564838cccb5fcc2349f0513c208934b2f"} Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.343288 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" event={"ID":"312dcf47-596f-497a-9dfd-2f782406b1f0","Type":"ContainerStarted","Data":"43959855a16c11466515e9c52818fe010908a39dce3f71b24567e12fb8ec9af7"} Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.346390 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zksp"] Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.357263 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" event={"ID":"968b751e-e05f-4b13-b627-a44b4db9777d","Type":"ContainerStarted","Data":"c57f8820c4a65a38037a061253548d8e322bdb64c4bf5d2798ebac9efb8adde4"} Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.357318 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" event={"ID":"968b751e-e05f-4b13-b627-a44b4db9777d","Type":"ContainerStarted","Data":"65315a35b7477586e914f0e34ae4d91af67856111fbbe37a09a03041a813f40f"} Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.358384 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.358523 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.362231 5118 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kfxlc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.362288 5118 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" podUID="968b751e-e05f-4b13-b627-a44b4db9777d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.398125 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" podStartSLOduration=32.398084072 podStartE2EDuration="32.398084072s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.394124975 +0000 UTC m=+75.397909548" watchObservedRunningTime="2026-02-23 06:46:52.398084072 +0000 UTC m=+75.401868645"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.398150 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsr2n\" (UniqueName: \"kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n\") pod \"community-operators-ldm4m\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " pod="openshift-marketplace/community-operators-ldm4m"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.404531 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" event={"ID":"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2","Type":"ContainerStarted","Data":"b9b5fbad3617ece967e1de9fc127861e471682d543a5dc5cd5e228c738bd1084"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.404571 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" event={"ID":"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2","Type":"ContainerStarted","Data":"7b57c63c53b165d947037e62f979f566c9f1200062339d3ba83014e17ec3e02d"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.409043 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.409190 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.409255 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.409281 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8jd\" (UniqueName: \"kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.409791 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:52.909779333 +0000 UTC m=+75.913563906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.411191 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.411597 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.419745 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4d7l"]
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.420774 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.421704 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" event={"ID":"03934e78-e05d-4971-a195-9a9df7443df3","Type":"ContainerStarted","Data":"69ad52863daa08e8010dc71adba88dc18e5ac3b5cfd724da71a0c774535ccb9f"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.421751 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" event={"ID":"03934e78-e05d-4971-a195-9a9df7443df3","Type":"ContainerStarted","Data":"0ea5f267d753da3f707c86a608f27ccd521a334fe2ae3eeeaeb2399f4c373b89"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.431030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" event={"ID":"d37a0834-d9d7-48d2-a69a-3f4982f227fc","Type":"ContainerStarted","Data":"cfee931fe19e3c1f1e321b52b3384b8923f54775a34afc62e4c2e754e599354f"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.431082 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" event={"ID":"d37a0834-d9d7-48d2-a69a-3f4982f227fc","Type":"ContainerStarted","Data":"2a5aab627fc3867b0f81ca04101dd306e932df3195353d138f4a9a4d43e189bb"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.440883 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" event={"ID":"bbe2caa7-6f2c-478a-9bc7-5f6ed72cc522","Type":"ContainerStarted","Data":"5f2f183eddbb636a2beaaf33053fbda142a0f714aa7a25d134dd632b7dd6447f"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.461968 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4d7l"]
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.488621 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8jd\" (UniqueName: \"kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd\") pod \"certified-operators-brcz5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.492555 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" event={"ID":"0980dea0-5e96-4761-a991-a2998c9f2684","Type":"ContainerStarted","Data":"45934cba954fdfe68be7139004824e556237454f4d6ca4062cb1155868c8f9de"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.498919 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lbv8k" event={"ID":"e4509298-a9d8-48a5-91b0-2c28a10e837b","Type":"ContainerStarted","Data":"265231187f1d85141eba4bdc5ed2c30db200278e8bfa0bd5b306952d426000a2"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.506723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" event={"ID":"31d16fe7-a19b-493e-b69b-c619870aa747","Type":"ContainerStarted","Data":"7240bcb77f9aa08336140783426bbb05bc571e123d22075b5b70218103166969"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.514477 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfmkj"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.520042 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.520386 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltk7\" (UniqueName: \"kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.520491 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.520529 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.520629 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.020613832 +0000 UTC m=+76.024398405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.548314 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" event={"ID":"813a6df4-23c6-4969-a911-9f74fa603ccf","Type":"ContainerStarted","Data":"b87ad872564383ec09fa8626ed643dbff36ab106ba625c68b4b3247819823c69"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.565996 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" event={"ID":"af54156b-7666-4bb6-8747-f5b7450b3c51","Type":"ContainerStarted","Data":"caab5d0f4bf8fa0fea77a39766a874a6c3b4c9227f3e74f69fb8f7e2f2ff4668"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.566057 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" event={"ID":"af54156b-7666-4bb6-8747-f5b7450b3c51","Type":"ContainerStarted","Data":"f53fab9e805e290ef3765e99be4ceb9538ffc6d99888057e897462acd92bc08e"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.567510 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.568711 5118 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mmd6b container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body=
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.568768 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" podUID="af54156b-7666-4bb6-8747-f5b7450b3c51" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.582647 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brcz5"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.599721 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-sqp4p" event={"ID":"adfd1b6a-2add-429b-b8e9-b245e1b999ad","Type":"ContainerStarted","Data":"42e80ff72608954eab52f3e566a04dbb66e52aa846c139207996843a2b4981f0"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.616714 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" event={"ID":"871941dc-2eb3-4bb4-8db4-e1a726f1171e","Type":"ContainerStarted","Data":"711069f5c06b167e9d075fe224199acf9db4c20815af70d7dc268d897e0304e0"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.617826 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.624557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xltk7\" (UniqueName: \"kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.624619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.624773 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.624843 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.627366 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.127343699 +0000 UTC m=+76.131128272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.627769 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.628335 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.714885 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldm4m"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.734010 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.737367 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.23732212 +0000 UTC m=+76.241106693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.739641 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qwhq" event={"ID":"57709b5a-89ef-4120-af9b-d96006564eff","Type":"ContainerStarted","Data":"23bddc4f6034b418e8eab16812e5ba52e60a1d71d324a155a71c4b02e81477ec"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.755743 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" event={"ID":"9b14be00-5577-4aaf-b7b2-762b27be7d7e","Type":"ContainerStarted","Data":"fd7eeadcfc364f10e48262f7dd8eacab3a5a45bdbd39e8ff788cb4b081829ada"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.766302 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9kwvf" podStartSLOduration=33.766283494 podStartE2EDuration="33.766283494s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.668914246 +0000 UTC m=+75.672698819" watchObservedRunningTime="2026-02-23 06:46:52.766283494 +0000 UTC m=+75.770068067"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.767655 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" podStartSLOduration=33.767647924 podStartE2EDuration="33.767647924s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.756520947 +0000 UTC m=+75.760305530" watchObservedRunningTime="2026-02-23 06:46:52.767647924 +0000 UTC m=+75.771432507"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.768412 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" event={"ID":"d3d6cc75-6216-4db9-8f11-6b5045548df1","Type":"ContainerStarted","Data":"e9adef745642dfa3ddf723f3216acf8f9c06361ee282503d3103bd8ffd472a18"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.789444 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" event={"ID":"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449","Type":"ContainerStarted","Data":"d4ae1d4195b391b4f2c5edcfaa297846abeb3cc554222f09f5fa15b7e3eee72f"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.797790 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" event={"ID":"4dae455a-bb65-4035-9cac-679bcb07e7f3","Type":"ContainerStarted","Data":"6793629c0f724a32d0a0aabde99ee63ac633637d19674b77b865e0fcb33541f7"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.802041 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xltk7\" (UniqueName: \"kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7\") pod \"community-operators-l4d7l\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.803674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerStarted","Data":"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.804938 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.827083 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" event={"ID":"71c2245d-dcc3-4d90-abaf-381b5784bcc5","Type":"ContainerStarted","Data":"92871b35ea8b0dec0331f65f423f67ed952e3ca865dd306df4abcfb4879b3eb0"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.841546 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.842820 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.342793569 +0000 UTC m=+76.346578352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.843784 5118 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-f42lm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.843854 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.860458 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" event={"ID":"18908f3c-feca-437a-bde5-df76b32e9a10","Type":"ContainerStarted","Data":"77cc40e5ea7104fee4cc055b4c72a6fd92d717513ed6140ba737774f2de4e97c"}
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.867358 5118 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7t79 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.867794 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7t79" podUID="b79b4aad-40b3-45ae-a757-36638cbb4571" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.886561 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4jd4v"
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.953962 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:52 crc kubenswrapper[5118]: E0223 06:46:52.956978 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.456952982 +0000 UTC m=+76.460737555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:52 crc kubenswrapper[5118]: I0223 06:46:52.979407 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.004154 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" podStartSLOduration=34.004123772 podStartE2EDuration="34.004123772s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.913422802 +0000 UTC m=+75.917207375" watchObservedRunningTime="2026-02-23 06:46:53.004123772 +0000 UTC m=+76.007908345"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.045126 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4d7l"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.057068 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.060233 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.560209262 +0000 UTC m=+76.563993835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.101504 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n6fnv" podStartSLOduration=33.101476761 podStartE2EDuration="33.101476761s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:52.993848604 +0000 UTC m=+75.997633177" watchObservedRunningTime="2026-02-23 06:46:53.101476761 +0000 UTC m=+76.105261334"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.166063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.166248 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.666212823 +0000 UTC m=+76.669997396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.166360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.166856 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.666833136 +0000 UTC m=+76.670617709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.196468 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lbv8k" podStartSLOduration=8.196446877 podStartE2EDuration="8.196446877s" podCreationTimestamp="2026-02-23 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.101662735 +0000 UTC m=+76.105447308" watchObservedRunningTime="2026-02-23 06:46:53.196446877 +0000 UTC m=+76.200231450"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.265114 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hvzgz" podStartSLOduration=34.265078626 podStartE2EDuration="34.265078626s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.204269341 +0000 UTC m=+76.208053914" watchObservedRunningTime="2026-02-23 06:46:53.265078626 +0000 UTC m=+76.268863199"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.282863 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.283580 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.783564657 +0000 UTC m=+76.787349230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.320769 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs" podStartSLOduration=34.320752855 podStartE2EDuration="34.320752855s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.266007466 +0000 UTC m=+76.269792039" watchObservedRunningTime="2026-02-23 06:46:53.320752855 +0000 UTC m=+76.324537428"
Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.321409 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-sqp4p" podStartSLOduration=34.32140298 podStartE2EDuration="34.32140298s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.320674163 +0000 UTC m=+76.324458736" watchObservedRunningTime="2026-02-23 06:46:53.32140298 +0000 UTC m=+76.325187553" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.389278 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.389713 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.889697022 +0000 UTC m=+76.893481585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.472356 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b" podStartSLOduration=33.472334793 podStartE2EDuration="33.472334793s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.408769226 +0000 UTC m=+76.412553799" watchObservedRunningTime="2026-02-23 06:46:53.472334793 +0000 UTC m=+76.476119356" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.474555 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" podStartSLOduration=33.474546741 podStartE2EDuration="33.474546741s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.471977564 +0000 UTC m=+76.475762137" watchObservedRunningTime="2026-02-23 06:46:53.474546741 +0000 UTC m=+76.478331304" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.493202 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.493895 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:53.993877082 +0000 UTC m=+76.997661655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.503183 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.526217 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:46:53 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld Feb 23 06:46:53 crc kubenswrapper[5118]: [+]process-running ok Feb 23 06:46:53 crc kubenswrapper[5118]: healthz check failed Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.526317 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.592142 5118 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" podStartSLOduration=34.59211457 podStartE2EDuration="34.59211457s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.584686745 +0000 UTC m=+76.588471318" watchObservedRunningTime="2026-02-23 06:46:53.59211457 +0000 UTC m=+76.595899143" Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.635447 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.635820 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.135808053 +0000 UTC m=+77.139592626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.743069 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.743459 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.243436091 +0000 UTC m=+77.247220654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.854021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.855435 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.355419726 +0000 UTC m=+77.359204299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.956759 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.956921 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" podStartSLOduration=33.956874206 podStartE2EDuration="33.956874206s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:53.689126331 +0000 UTC m=+76.692910904" watchObservedRunningTime="2026-02-23 06:46:53.956874206 +0000 UTC m=+76.960658779" Feb 23 06:46:53 crc kubenswrapper[5118]: E0223 06:46:53.957081 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.45706519 +0000 UTC m=+77.460849763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.958830 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.960534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" event={"ID":"312dcf47-596f-497a-9dfd-2f782406b1f0","Type":"ContainerStarted","Data":"9c1e7c11e6a6d14d53bfd8923f41ce14f703437ec50686f2a225362c56afa3a7"} Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.960576 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" event={"ID":"312dcf47-596f-497a-9dfd-2f782406b1f0","Type":"ContainerStarted","Data":"1a791c40ec5f4bd55e823263eeda5dbfb8569a275f3d91aaded8a4a953a49f54"} Feb 23 06:46:53 crc kubenswrapper[5118]: I0223 06:46:53.961158 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.021959 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"] Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.038536 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9kjcn" 
event={"ID":"31d16fe7-a19b-493e-b69b-c619870aa747","Type":"ContainerStarted","Data":"ace55894e98e9f5316ffb4c625e857289572a1f93cb1595c9baf734fa32c6071"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.040797 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"] Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.042087 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.049452 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.061390 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.063871 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.563851419 +0000 UTC m=+77.567635992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.069816 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" event={"ID":"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449","Type":"ContainerStarted","Data":"cd34a4c0ca1c8bbcd5ce25a46670aa8eece485deb2e13ba19d891fe264d8323e"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.092762 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" podStartSLOduration=34.092743463 podStartE2EDuration="34.092743463s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.054695205 +0000 UTC m=+77.058479778" watchObservedRunningTime="2026-02-23 06:46:54.092743463 +0000 UTC m=+77.096528036" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.099243 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"] Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.126202 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ldm4m"] Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.136451 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" 
event={"ID":"d37a0834-d9d7-48d2-a69a-3f4982f227fc","Type":"ContainerStarted","Data":"0713b6e91afc769722e1e43726b5eadee7ac92de779c6b30de9738149c9a72e5"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.163798 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.164132 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjfp\" (UniqueName: \"kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.164177 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.164243 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.165253 5118 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.665229697 +0000 UTC m=+77.669014270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.185327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" event={"ID":"9b14be00-5577-4aaf-b7b2-762b27be7d7e","Type":"ContainerStarted","Data":"293fc2f1fa09ac14bbafb9a51ad217e6f05800b6cd865e627416ada90ca884f6"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.188454 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-548rl" podStartSLOduration=34.188428474 podStartE2EDuration="34.188428474s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.187049503 +0000 UTC m=+77.190834066" watchObservedRunningTime="2026-02-23 06:46:54.188428474 +0000 UTC m=+77.192213047" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.217919 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-77kkb" podStartSLOduration=34.21790077 podStartE2EDuration="34.21790077s" podCreationTimestamp="2026-02-23 06:46:20 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.216483899 +0000 UTC m=+77.220268472" watchObservedRunningTime="2026-02-23 06:46:54.21790077 +0000 UTC m=+77.221685343" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.233352 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x8c6x" event={"ID":"4a3c2076-5da9-4b1d-9485-37c3b578669f","Type":"ContainerStarted","Data":"006c06da1f904626f5b0f19bbfa3051edcb896846f8f693d5e46bfd3183a1362"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.234198 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-x8c6x" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.267529 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" event={"ID":"d3d6cc75-6216-4db9-8f11-6b5045548df1","Type":"ContainerStarted","Data":"82501c5415d9361680c228425647eaff5002fe1028e577f5330b688aab6b335f"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.282962 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.283045 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 
06:46:54.283155 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjfp\" (UniqueName: \"kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.283200 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.283697 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.284435 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.784421962 +0000 UTC m=+77.788206535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.285613 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.286841 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x8c6x" podStartSLOduration=9.286818235 podStartE2EDuration="9.286818235s" podCreationTimestamp="2026-02-23 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.284609226 +0000 UTC m=+77.288393799" watchObservedRunningTime="2026-02-23 06:46:54.286818235 +0000 UTC m=+77.290602808" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.311591 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" event={"ID":"7943dfa3-6af3-43ca-b8f4-fc0d40d0b4a2","Type":"ContainerStarted","Data":"b0046efa52e41f43c9d5f2a12567ba6017e9154b8e612938840782a56ceb60a6"} Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.338117 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4d7l"] Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.389327 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.391698 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.891667501 +0000 UTC m=+77.895452074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.393520 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.395630 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2rz7k" podStartSLOduration=34.395604439 podStartE2EDuration="34.395604439s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.352044748 +0000 UTC m=+77.355829321" watchObservedRunningTime="2026-02-23 06:46:54.395604439 +0000 UTC m=+77.399389012"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.397511 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qq9bd" podStartSLOduration=34.397504371 podStartE2EDuration="34.397504371s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.395526297 +0000 UTC m=+77.399310870" watchObservedRunningTime="2026-02-23 06:46:54.397504371 +0000 UTC m=+77.401288934"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.397758 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5v8wn" event={"ID":"18908f3c-feca-437a-bde5-df76b32e9a10","Type":"ContainerStarted","Data":"97ad125e9f16a0b29f635a570bdd95793ca0a7ac411ae6131ce2c42d1cab6bb1"}
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.398479 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.898466422 +0000 UTC m=+77.902250995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.438310 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjfp\" (UniqueName: \"kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp\") pod \"redhat-marketplace-89mtl\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " pod="openshift-marketplace/redhat-marketplace-89mtl"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.443549 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"]
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.457759 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" gracePeriod=30
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.459150 5118 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7t79 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.459308 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7t79" podUID="b79b4aad-40b3-45ae-a757-36638cbb4571" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.459544 5118 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-f42lm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.459618 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.462255 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qwhq" event={"ID":"57709b5a-89ef-4120-af9b-d96006564eff","Type":"ContainerStarted","Data":"097a9e5e4a01e55c82c2802ac27c92e8ed3c0ce2e491ee13b6f3eb2cc061dd42"}
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.462480 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: W0223 06:46:54.472575 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00c172a6_346f_41df_8fb1_3f6c9458da33.slice/crio-fc58a9f26a578ef68b6b9188823b5427feeb3fb52895603791fb175314df98ed WatchSource:0}: Error finding container fc58a9f26a578ef68b6b9188823b5427feeb3fb52895603791fb175314df98ed: Status 404 returned error can't find the container with id fc58a9f26a578ef68b6b9188823b5427feeb3fb52895603791fb175314df98ed
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.487227 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mmd6b"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.498296 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.499162 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:54.999124775 +0000 UTC m=+78.002909348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.504052 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:46:54 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld
Feb 23 06:46:54 crc kubenswrapper[5118]: [+]process-running ok
Feb 23 06:46:54 crc kubenswrapper[5118]: healthz check failed
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.504135 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.510542 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"]
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.541238 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9qwhq" podStartSLOduration=35.541209382 podStartE2EDuration="35.541209382s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:54.540850584 +0000 UTC m=+77.544635157" watchObservedRunningTime="2026-02-23 06:46:54.541209382 +0000 UTC m=+77.544993955"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.600078 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.600138 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jcb\" (UniqueName: \"kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.600516 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.600634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.605399 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.105378912 +0000 UTC m=+78.109163485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.706527 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.707290 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.707376 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.707403 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jcb\" (UniqueName: \"kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.707773 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.207757232 +0000 UTC m=+78.211541805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.708223 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.708450 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.713311 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89mtl"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.740472 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6kgbs"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.808758 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.833011 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.332988051 +0000 UTC m=+78.336772624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.863297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jcb\" (UniqueName: \"kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb\") pod \"redhat-marketplace-jvh5v\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:54 crc kubenswrapper[5118]: I0223 06:46:54.910804 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:54 crc kubenswrapper[5118]: E0223 06:46:54.911344 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.411298516 +0000 UTC m=+78.415083089 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.012543 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.012889 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.512877799 +0000 UTC m=+78.516662362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.030262 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-526dd"]
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.031372 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.060689 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.071628 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-526dd"]
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.095416 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh5v"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.113725 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.113940 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.113964 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.114014 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvsr\" (UniqueName: \"kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.114219 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.614197906 +0000 UTC m=+78.617982479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.216722 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.217022 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.217075 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvsr\" (UniqueName: \"kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.217127 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.217499 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.717483147 +0000 UTC m=+78.721267710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.217924 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.218160 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.258430 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"]
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.271939 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.302663 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvsr\" (UniqueName: \"kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr\") pod \"redhat-operators-526dd\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.318843 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.319471 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.819435528 +0000 UTC m=+78.823220101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.339624 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"]
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.357289 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.422908 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.422967 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbjt\" (UniqueName: \"kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.422999 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.423493 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.423578 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:55.923560747 +0000 UTC m=+78.927345320 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.457825 5118 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kfxlc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.457885 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" podUID="968b751e-e05f-4b13-b627-a44b4db9777d" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.503571 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" event={"ID":"edf8f2e2-d2d1-4ceb-a3eb-f04c73af6449","Type":"ContainerStarted","Data":"5df6c5185974aca5b90549841a2dac518ae2424d31d1ad6d68ef5ec904f80e5a"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.507473 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:46:55 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld
Feb 23 06:46:55 crc kubenswrapper[5118]: [+]process-running ok
Feb 23 06:46:55 crc kubenswrapper[5118]: healthz check failed
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.507514 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.523702 5118 generic.go:334] "Generic (PLEG): container finished" podID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerID="6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b" exitCode=0
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.523766 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerDied","Data":"6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.523795 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerStarted","Data":"fc58a9f26a578ef68b6b9188823b5427feeb3fb52895603791fb175314df98ed"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.532927 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.533285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.533363 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.533394 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbjt\" (UniqueName: \"kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.533733 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.033718301 +0000 UTC m=+79.037502874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.534981 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.543206 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"]
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.545240 5118 generic.go:334] "Generic (PLEG): container finished" podID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerID="50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f" exitCode=0
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.546182 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerDied","Data":"50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.546225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerStarted","Data":"ad6ad238884b7933edc2da484f21c8f29e105ff308bd424ee86eee6d0b42be39"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.546268 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.552200 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rck9k" podStartSLOduration=35.552173592 podStartE2EDuration="35.552173592s" podCreationTimestamp="2026-02-23 06:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:55.539601452 +0000 UTC m=+78.543386025" watchObservedRunningTime="2026-02-23 06:46:55.552173592 +0000 UTC m=+78.555958165"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.553689 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.572620 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" event={"ID":"4dae455a-bb65-4035-9cac-679bcb07e7f3","Type":"ContainerStarted","Data":"3a70e6d2c22f6d15fb1094d8b0b0e0c5d3eed8f38527489b154fcf93a1c1d3c3"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.585476 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbjt\" (UniqueName: \"kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt\") pod \"redhat-operators-qknnk\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.604429 5118 generic.go:334] "Generic (PLEG): container finished" podID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerID="a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd" exitCode=0
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.604548 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerDied","Data":"a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.604593 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerStarted","Data":"d4b94230197bc7f443ccb1e3e8d6e1919aeacf4a142772dcc3fe4c3b32b8921f"}
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.621979 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qknnk"
Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.634514 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.636193 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.136180523 +0000 UTC m=+79.139965106 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.636709 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9qwhq" event={"ID":"57709b5a-89ef-4120-af9b-d96006564eff","Type":"ContainerStarted","Data":"bd07e133ced3181c7e6267ba6f808a4e78b20882f8e5ff7f5fb70152eaa10c16"} Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.736707 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.738305 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.238286018 +0000 UTC m=+79.242070591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.777085 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x8c6x" event={"ID":"4a3c2076-5da9-4b1d-9485-37c3b578669f","Type":"ContainerStarted","Data":"171070ac8b697c93e86533302a28942ecd899f70568fb0a423803fff9814de64"} Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.789690 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.790465 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.810183 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.810466 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.817731 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.839656 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.840491 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.340472274 +0000 UTC m=+79.344256847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.857354 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"] Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.868170 5118 generic.go:334] "Generic (PLEG): container finished" podID="b35a400e-565d-40e1-aa28-4896f541c19f" containerID="a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8" exitCode=0 Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.890851 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerDied","Data":"a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8"} Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.890988 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kfxlc" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.891006 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerStarted","Data":"f47d887f28583191e65ec216e11560234247effca3bb6f24ce7e42480d8e930f"} Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.891032 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 
06:46:55.942275 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.942586 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.942688 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:55 crc kubenswrapper[5118]: E0223 06:46:55.943979 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.443955539 +0000 UTC m=+79.447740112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.978519 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"] Feb 23 06:46:55 crc kubenswrapper[5118]: I0223 06:46:55.978765 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerName="controller-manager" containerID="cri-o://9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3" gracePeriod=30 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.072884 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.073621 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.073871 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.085877 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.086437 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.586413803 +0000 UTC m=+79.590198376 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.118705 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-526dd"] Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.129965 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"] Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.151261 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.175831 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.176294 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:56.676272174 +0000 UTC m=+79.680056747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.220538 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.277805 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.278292 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.778278097 +0000 UTC m=+79.782062670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.379234 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.380163 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.880132305 +0000 UTC m=+79.883916878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.413844 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"] Feb 23 06:46:56 crc kubenswrapper[5118]: W0223 06:46:56.453603 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode669f113_9929_4497_ba2b_e5cc2169ae24.slice/crio-332be544671998abc5830870daadbe62b9a04b216facca13c128bc3a59a2d482 WatchSource:0}: Error finding container 332be544671998abc5830870daadbe62b9a04b216facca13c128bc3a59a2d482: Status 404 returned error can't find the container with id 332be544671998abc5830870daadbe62b9a04b216facca13c128bc3a59a2d482 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.482293 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.482657 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:56.982642189 +0000 UTC m=+79.986426762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.510077 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:46:56 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld Feb 23 06:46:56 crc kubenswrapper[5118]: [+]process-running ok Feb 23 06:46:56 crc kubenswrapper[5118]: healthz check failed Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.510277 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.583679 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.584828 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:57.084806385 +0000 UTC m=+80.088590958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.639793 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 06:46:56 crc kubenswrapper[5118]: W0223 06:46:56.659415 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9250ab61_5840_443a_b138_0f6652080ea6.slice/crio-957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7 WatchSource:0}: Error finding container 957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7: Status 404 returned error can't find the container with id 957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.686461 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.687072 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 06:46:57.187050103 +0000 UTC m=+80.190834676 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.712421 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.787996 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.788225 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.288193836 +0000 UTC m=+80.291978409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.788319 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.788713 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.288705667 +0000 UTC m=+80.292490240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.891763 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles\") pod \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.891819 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zphm\" (UniqueName: \"kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm\") pod \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.891940 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.892037 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca\") pod \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.892124 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert\") pod \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.892173 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config\") pod \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\" (UID: \"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118\") " Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.893669 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config" (OuterVolumeSpecName: "config") pod "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" (UID: "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.894324 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.894514 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.394477543 +0000 UTC m=+80.398262116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.894768 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerName="controller-manager" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.894839 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerName="controller-manager" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.895030 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerName="controller-manager" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.895525 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" (UID: "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.895696 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.895938 5118 generic.go:334] "Generic (PLEG): container finished" podID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" containerID="9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3" exitCode=0 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.896083 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" event={"ID":"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118","Type":"ContainerDied","Data":"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.896198 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" event={"ID":"a9b5bcf2-850b-4e39-8f2b-e2bf9467b118","Type":"ContainerDied","Data":"7ad77e712b97912ad69865416db69013ab487807ac581ca2d90a8e510e267828"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.896228 5118 scope.go:117] "RemoveContainer" containerID="9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.896355 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" (UID: "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.897749 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lmbrq" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.903469 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.903809 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.906386 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm" (OuterVolumeSpecName: "kube-api-access-8zphm") pod "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" (UID: "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118"). InnerVolumeSpecName "kube-api-access-8zphm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.906886 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.918695 5118 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.925524 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" (UID: "a9b5bcf2-850b-4e39-8f2b-e2bf9467b118"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.943073 5118 generic.go:334] "Generic (PLEG): container finished" podID="67106794-6724-41de-9db8-d51c468e1e28" containerID="4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268" exitCode=0 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.943185 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerDied","Data":"4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.943222 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerStarted","Data":"6cfdf60797c9c095e1906d90ce251c8d495ec5b4f750e025d77e8150c341f859"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.966380 5118 generic.go:334] "Generic (PLEG): container finished" podID="03934e78-e05d-4971-a195-9a9df7443df3" containerID="69ad52863daa08e8010dc71adba88dc18e5ac3b5cfd724da71a0c774535ccb9f" exitCode=0 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.966597 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" event={"ID":"03934e78-e05d-4971-a195-9a9df7443df3","Type":"ContainerDied","Data":"69ad52863daa08e8010dc71adba88dc18e5ac3b5cfd724da71a0c774535ccb9f"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.973628 5118 generic.go:334] "Generic (PLEG): container finished" podID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerID="8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074" exitCode=0 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.973696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" 
event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerDied","Data":"8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.973721 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerStarted","Data":"ec73d0468bbb1f12443a3b0193ab2ecd89761ad6c90611fa923afcd646ffed79"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.981041 5118 generic.go:334] "Generic (PLEG): container finished" podID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerID="80c68126beabffb30f3625d22dac1ad5512216113bf8b196394018a315a613c3" exitCode=0 Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.981174 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerDied","Data":"80c68126beabffb30f3625d22dac1ad5512216113bf8b196394018a315a613c3"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.981225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerStarted","Data":"332be544671998abc5830870daadbe62b9a04b216facca13c128bc3a59a2d482"} Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.994762 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.994900 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.994913 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.994926 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.995453 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zphm\" (UniqueName: \"kubernetes.io/projected/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-kube-api-access-8zphm\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.995500 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:46:56 crc kubenswrapper[5118]: E0223 06:46:56.995869 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.495843991 +0000 UTC m=+80.499628734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.997318 5118 scope.go:117] "RemoveContainer" containerID="9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3" Feb 23 06:46:56 crc kubenswrapper[5118]: I0223 06:46:56.998633 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" event={"ID":"4dae455a-bb65-4035-9cac-679bcb07e7f3","Type":"ContainerStarted","Data":"56e4dceccf1d0c66c59baa1f67222d74f8c0e0f409808ad7d82c5d1891accc58"} Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.000389 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3\": container with ID starting with 9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3 not found: ID does not exist" containerID="9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.000449 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3"} err="failed to get container status \"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3\": rpc error: code = NotFound desc = could not find container \"9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3\": container with ID starting with 9e23abae01d7cb1f0f9372061bcac13bcbbfbd83a921d635ff3d2adc075584a3 not found: ID 
does not exist" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.008799 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9250ab61-5840-443a-b138-0f6652080ea6","Type":"ContainerStarted","Data":"957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7"} Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.021569 5118 generic.go:334] "Generic (PLEG): container finished" podID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerID="8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12" exitCode=0 Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.021767 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerDied","Data":"8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12"} Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.021829 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerStarted","Data":"797e0ec091fcede2fd9bc0c59729c3689f789c272023c7ad82cc7465e4211974"} Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.023907 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerName="route-controller-manager" containerID="cri-o://d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21" gracePeriod=30 Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.097184 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.097337 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.597309771 +0000 UTC m=+80.601094344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.098317 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.098491 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.098526 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.099692 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.599163743 +0000 UTC m=+80.602948316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.201414 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.201644 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.701596864 +0000 UTC m=+80.705381437 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.201792 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.201889 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.201924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.202963 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.203437 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.703414404 +0000 UTC m=+80.707198977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.231998 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.243185 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.279802 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"] Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.279877 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lmbrq"] Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.303568 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.304199 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.804174789 +0000 UTC m=+80.807959362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.405011 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.405424 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:57.905408135 +0000 UTC m=+80.909192708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.487030 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.487857 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.497606 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.498187 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.501361 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:46:57 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld Feb 23 06:46:57 crc kubenswrapper[5118]: [+]process-running ok Feb 23 06:46:57 crc kubenswrapper[5118]: healthz check failed Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.501444 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.502383 5118 patch_prober.go:28] interesting pod/console-f9d7485db-r9t74 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.502428 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-r9t74" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.504907 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.505813 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.506141 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.006126478 +0000 UTC m=+81.009911051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.600613 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.608163 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert\") pod \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.608219 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca\") pod \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.608252 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lghxp\" (UniqueName: \"kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp\") pod \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.608290 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config\") pod 
\"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\" (UID: \"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0\") " Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.608551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.610447 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.110431311 +0000 UTC m=+81.114215884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.613921 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config" (OuterVolumeSpecName: "config") pod "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" (UID: "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.616834 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" (UID: "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.619596 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" (UID: "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.621636 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp" (OuterVolumeSpecName: "kube-api-access-lghxp") pod "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" (UID: "ea38a1cc-9f9a-4379-8638-70e34b6bc8e0"). InnerVolumeSpecName "kube-api-access-lghxp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.676280 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"]
Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.676678 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerName="route-controller-manager"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.676699 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerName="route-controller-manager"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.676814 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerName="route-controller-manager"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.677388 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.683624 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.683940 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.684127 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.684123 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.684217 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.684428 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.687566 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"]
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.695247 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.709909 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.710001 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.209980099 +0000 UTC m=+81.213764672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710435 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-689gj\" (UniqueName: \"kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710588 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710676 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710789 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710831 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710887 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710944 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710967 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710981 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lghxp\" (UniqueName: \"kubernetes.io/projected/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-kube-api-access-lghxp\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.710996 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0-config\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.711792 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.211776579 +0000 UTC m=+81.215561152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.717789 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b5bcf2-850b-4e39-8f2b-e2bf9467b118" path="/var/lib/kubelet/pods/a9b5bcf2-850b-4e39-8f2b-e2bf9467b118/volumes"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.779368 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.812676 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.812995 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.312951703 +0000 UTC m=+81.316736276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813071 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813160 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813190 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-689gj\" (UniqueName: \"kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813248 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813293 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.813367 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: E0223 06:46:57.813735 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 06:46:58.313708989 +0000 UTC m=+81.317493562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sh4c" (UID: "08634871-e819-4b75-93e5-fed45013b977") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.814628 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.817884 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.819130 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: W0223 06:46:57.831002 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c37c475_22ac_48ab_b699_0e01e84f1536.slice/crio-1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c WatchSource:0}: Error finding container 1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c: Status 404 returned error can't find the container with id 1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.836472 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-689gj\" (UniqueName: \"kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.837594 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert\") pod \"controller-manager-56b586ccb9-fnncr\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.862422 5118 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-23T06:46:56.918723953Z","Handler":null,"Name":""}
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.867421 5118 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.867464 5118 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.922593 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 06:46:57 crc kubenswrapper[5118]: I0223 06:46:57.945047 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.009842 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.033083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.038319 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.038354 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.093276 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sh4c\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.117824 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" event={"ID":"4dae455a-bb65-4035-9cac-679bcb07e7f3","Type":"ContainerStarted","Data":"5623c02dfbaddd73b6cde92af35189f62412d6cd449b69af6f7abf14742ab041"}
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.123808 5118 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7t79 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.123870 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f7t79" podUID="b79b4aad-40b3-45ae-a757-36638cbb4571" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.123903 5118 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7t79 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.123970 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7t79" podUID="b79b4aad-40b3-45ae-a757-36638cbb4571" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.132214 5118 generic.go:334] "Generic (PLEG): container finished" podID="9250ab61-5840-443a-b138-0f6652080ea6" containerID="2c00283b7cc731f90cb5136bc2fb918bc718b1ed5d953d2ef1dcfe313455b6e7" exitCode=0
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.132271 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9250ab61-5840-443a-b138-0f6652080ea6","Type":"ContainerDied","Data":"2c00283b7cc731f90cb5136bc2fb918bc718b1ed5d953d2ef1dcfe313455b6e7"}
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.173209 5118 generic.go:334] "Generic (PLEG): container finished" podID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" containerID="d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21" exitCode=0
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.173302 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" event={"ID":"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0","Type":"ContainerDied","Data":"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"}
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.173752 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw" event={"ID":"ea38a1cc-9f9a-4379-8638-70e34b6bc8e0","Type":"ContainerDied","Data":"ba9d7fdfbf9dab12c2b1e9ad2a0fe52a354bf403d4943170c3120b2c034eb204"}
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.173359 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.173790 5118 scope.go:117] "RemoveContainer" containerID="d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.184823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c37c475-22ac-48ab-b699-0e01e84f1536","Type":"ContainerStarted","Data":"1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c"}
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.199144 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"]
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.200651 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-m4mzh"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.203649 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lmzgw"]
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.255936 5118 scope.go:117] "RemoveContainer" containerID="d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"
Feb 23 06:46:58 crc kubenswrapper[5118]: E0223 06:46:58.301611 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21\": container with ID starting with d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21 not found: ID does not exist" containerID="d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.301681 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21"} err="failed to get container status \"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21\": rpc error: code = NotFound desc = could not find container \"d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21\": container with ID starting with d9a8a4d58301d12bb9f80a71e4938873a4ec4c454c453fc57dfc8b876f6a7a21 not found: ID does not exist"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.351244 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.495814 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-sqp4p"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.519306 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:46:58 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld
Feb 23 06:46:58 crc kubenswrapper[5118]: [+]process-running ok
Feb 23 06:46:58 crc kubenswrapper[5118]: healthz check failed
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.519374 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.856534 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"]
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.887692 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.904509 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"]
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.967824 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume\") pod \"03934e78-e05d-4971-a195-9a9df7443df3\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") "
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.968348 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume\") pod \"03934e78-e05d-4971-a195-9a9df7443df3\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") "
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.968600 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pk49\" (UniqueName: \"kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49\") pod \"03934e78-e05d-4971-a195-9a9df7443df3\" (UID: \"03934e78-e05d-4971-a195-9a9df7443df3\") "
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.972391 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume" (OuterVolumeSpecName: "config-volume") pod "03934e78-e05d-4971-a195-9a9df7443df3" (UID: "03934e78-e05d-4971-a195-9a9df7443df3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:46:58 crc kubenswrapper[5118]: E0223 06:46:58.977812 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 06:46:58 crc kubenswrapper[5118]: I0223 06:46:58.987435 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49" (OuterVolumeSpecName: "kube-api-access-7pk49") pod "03934e78-e05d-4971-a195-9a9df7443df3" (UID: "03934e78-e05d-4971-a195-9a9df7443df3"). InnerVolumeSpecName "kube-api-access-7pk49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:58.997942 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03934e78-e05d-4971-a195-9a9df7443df3" (UID: "03934e78-e05d-4971-a195-9a9df7443df3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 06:46:59 crc kubenswrapper[5118]: E0223 06:46:59.009160 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 06:46:59 crc kubenswrapper[5118]: E0223 06:46:59.020535 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 06:46:59 crc kubenswrapper[5118]: E0223 06:46:59.020642 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.073858 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pk49\" (UniqueName: \"kubernetes.io/projected/03934e78-e05d-4971-a195-9a9df7443df3-kube-api-access-7pk49\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.073909 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03934e78-e05d-4971-a195-9a9df7443df3-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.073918 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03934e78-e05d-4971-a195-9a9df7443df3-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.261796 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c37c475-22ac-48ab-b699-0e01e84f1536","Type":"ContainerStarted","Data":"e1a5541bb18c02f7590ac9b6d4ba8d6f260cd7e9e5063354e3cd16bdbd84edd3"}
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.293706 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" event={"ID":"08634871-e819-4b75-93e5-fed45013b977","Type":"ContainerStarted","Data":"86cf8a2fc28135093762a9bafb873d6876ced90069ae33ebc61099b61cde6192"}
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.305809 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5" event={"ID":"03934e78-e05d-4971-a195-9a9df7443df3","Type":"ContainerDied","Data":"0ea5f267d753da3f707c86a608f27ccd521a334fe2ae3eeeaeb2399f4c373b89"}
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.305871 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea5f267d753da3f707c86a608f27ccd521a334fe2ae3eeeaeb2399f4c373b89"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.305972 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.332181 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.332164054 podStartE2EDuration="3.332164054s" podCreationTimestamp="2026-02-23 06:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:59.325046716 +0000 UTC m=+82.328831279" watchObservedRunningTime="2026-02-23 06:46:59.332164054 +0000 UTC m=+82.335948627"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.410495 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" event={"ID":"4dae455a-bb65-4035-9cac-679bcb07e7f3","Type":"ContainerStarted","Data":"885d448227392cae51136f3feaef2df99048702c98a2aacecedf942365cc5aa2"}
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.435904 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" event={"ID":"1ce759e4-1e35-4f6f-866d-e810a5e231c8","Type":"ContainerStarted","Data":"525ecb7103bf2c8651777be1f7e52815de92714c1c2777289d2bd08951980337"}
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.500007 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 06:46:59 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld
Feb 23 06:46:59 crc kubenswrapper[5118]: [+]process-running ok
Feb 23 06:46:59 crc kubenswrapper[5118]: healthz check failed
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.500081 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.504483 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" podStartSLOduration=14.504466233 podStartE2EDuration="14.504466233s" podCreationTimestamp="2026-02-23 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:46:59.501089017 +0000 UTC m=+82.504873590" watchObservedRunningTime="2026-02-23 06:46:59.504466233 +0000 UTC m=+82.508250806"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.659200 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"]
Feb 23 06:46:59 crc kubenswrapper[5118]: E0223 06:46:59.659662 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03934e78-e05d-4971-a195-9a9df7443df3" containerName="collect-profiles"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.659674 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="03934e78-e05d-4971-a195-9a9df7443df3" containerName="collect-profiles"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.659797 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="03934e78-e05d-4971-a195-9a9df7443df3" containerName="collect-profiles"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.660269 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.669880 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.670197 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.670350 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.670590 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.670739 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.671394 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.676340 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"]
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.691881 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"
Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.691938 5118 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.692049 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.692113 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqb5c\" (UniqueName: \"kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.727768 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.730505 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea38a1cc-9f9a-4379-8638-70e34b6bc8e0" path="/var/lib/kubelet/pods/ea38a1cc-9f9a-4379-8638-70e34b6bc8e0/volumes" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.793612 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqb5c\" (UniqueName: 
\"kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.793683 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.793727 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.793795 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.795544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc 
kubenswrapper[5118]: I0223 06:46:59.795595 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.809401 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:46:59 crc kubenswrapper[5118]: I0223 06:46:59.812566 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqb5c\" (UniqueName: \"kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c\") pod \"route-controller-manager-7dfd46479d-4f8fj\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.014802 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.016987 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.101713 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access\") pod \"9250ab61-5840-443a-b138-0f6652080ea6\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.101820 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir\") pod \"9250ab61-5840-443a-b138-0f6652080ea6\" (UID: \"9250ab61-5840-443a-b138-0f6652080ea6\") " Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.102046 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9250ab61-5840-443a-b138-0f6652080ea6" (UID: "9250ab61-5840-443a-b138-0f6652080ea6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.102558 5118 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9250ab61-5840-443a-b138-0f6652080ea6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.112885 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9250ab61-5840-443a-b138-0f6652080ea6" (UID: "9250ab61-5840-443a-b138-0f6652080ea6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.204126 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9250ab61-5840-443a-b138-0f6652080ea6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.503337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" event={"ID":"1ce759e4-1e35-4f6f-866d-e810a5e231c8","Type":"ContainerStarted","Data":"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9"} Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.504706 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.507063 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:47:00 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld Feb 23 06:47:00 crc kubenswrapper[5118]: [+]process-running ok Feb 23 06:47:00 crc kubenswrapper[5118]: healthz check failed Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.507123 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.516358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"9250ab61-5840-443a-b138-0f6652080ea6","Type":"ContainerDied","Data":"957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7"} Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.516401 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957ee4844c70a47a63dcc8f4838d6079d0ce0e28747233e472e45c9cd04440c7" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.516482 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.539282 5118 generic.go:334] "Generic (PLEG): container finished" podID="1c37c475-22ac-48ab-b699-0e01e84f1536" containerID="e1a5541bb18c02f7590ac9b6d4ba8d6f260cd7e9e5063354e3cd16bdbd84edd3" exitCode=0 Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.539696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c37c475-22ac-48ab-b699-0e01e84f1536","Type":"ContainerDied","Data":"e1a5541bb18c02f7590ac9b6d4ba8d6f260cd7e9e5063354e3cd16bdbd84edd3"} Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.570065 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.593912 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" podStartSLOduration=4.59387224 podStartE2EDuration="4.59387224s" podCreationTimestamp="2026-02-23 06:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:00.539075909 +0000 UTC m=+83.542860482" watchObservedRunningTime="2026-02-23 06:47:00.59387224 +0000 UTC m=+83.597656833" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 
06:47:00.604613 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" event={"ID":"08634871-e819-4b75-93e5-fed45013b977","Type":"ContainerStarted","Data":"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa"} Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.606540 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:47:00 crc kubenswrapper[5118]: I0223 06:47:00.612748 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"] Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.501637 5118 patch_prober.go:28] interesting pod/router-default-5444994796-sqp4p container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 06:47:01 crc kubenswrapper[5118]: [-]has-synced failed: reason withheld Feb 23 06:47:01 crc kubenswrapper[5118]: [+]process-running ok Feb 23 06:47:01 crc kubenswrapper[5118]: healthz check failed Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.501823 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-sqp4p" podUID="adfd1b6a-2add-429b-b8e9-b245e1b999ad" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.663443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" event={"ID":"07c58462-b8e6-404f-baa0-8eb3fec9bc6c","Type":"ContainerStarted","Data":"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902"} Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.663527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" event={"ID":"07c58462-b8e6-404f-baa0-8eb3fec9bc6c","Type":"ContainerStarted","Data":"d95a6fd1d8ded7733cf24d487eb5f6893c946a6c3c08f73d5950036d7edce700"} Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.684206 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" podStartSLOduration=42.684174967 podStartE2EDuration="42.684174967s" podCreationTimestamp="2026-02-23 06:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:00.662153011 +0000 UTC m=+83.665937604" watchObservedRunningTime="2026-02-23 06:47:01.684174967 +0000 UTC m=+84.687959540" Feb 23 06:47:01 crc kubenswrapper[5118]: I0223 06:47:01.687565 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" podStartSLOduration=5.6875562219999996 podStartE2EDuration="5.687556222s" podCreationTimestamp="2026-02-23 06:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:01.684819321 +0000 UTC m=+84.688603894" watchObservedRunningTime="2026-02-23 06:47:01.687556222 +0000 UTC m=+84.691340795" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.053409 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.085832 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir\") pod \"1c37c475-22ac-48ab-b699-0e01e84f1536\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.085970 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c37c475-22ac-48ab-b699-0e01e84f1536" (UID: "1c37c475-22ac-48ab-b699-0e01e84f1536"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.086020 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access\") pod \"1c37c475-22ac-48ab-b699-0e01e84f1536\" (UID: \"1c37c475-22ac-48ab-b699-0e01e84f1536\") " Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.086369 5118 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c37c475-22ac-48ab-b699-0e01e84f1536-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.098235 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c37c475-22ac-48ab-b699-0e01e84f1536" (UID: "1c37c475-22ac-48ab-b699-0e01e84f1536"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.188306 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c37c475-22ac-48ab-b699-0e01e84f1536-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.499499 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.503858 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-sqp4p" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.685874 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.687631 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1c37c475-22ac-48ab-b699-0e01e84f1536","Type":"ContainerDied","Data":"1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c"} Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.687679 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a9159555364990ea69b3b4131daf39282fd5295e6464e872331d3c30794e66c" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.687710 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:02 crc kubenswrapper[5118]: I0223 06:47:02.694040 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:03 crc kubenswrapper[5118]: I0223 06:47:03.042719 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:47:03 crc kubenswrapper[5118]: I0223 06:47:03.902306 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x8c6x" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.767959 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.768367 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.768433 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.768487 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.769189 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.779687 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.779777 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:05 crc kubenswrapper[5118]: I0223 06:47:05.806890 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:06 crc kubenswrapper[5118]: I0223 06:47:06.001085 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 06:47:06 crc kubenswrapper[5118]: I0223 06:47:06.014458 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:06 crc kubenswrapper[5118]: I0223 06:47:06.022348 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 06:47:07 crc kubenswrapper[5118]: I0223 06:47:07.497481 5118 patch_prober.go:28] interesting pod/console-f9d7485db-r9t74 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 23 06:47:07 crc kubenswrapper[5118]: I0223 06:47:07.497827 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-r9t74" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 23 06:47:08 crc kubenswrapper[5118]: I0223 06:47:08.139351 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f7t79" Feb 23 06:47:08 crc kubenswrapper[5118]: E0223 06:47:08.880672 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:08 crc kubenswrapper[5118]: E0223 06:47:08.885152 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:08 crc kubenswrapper[5118]: E0223 06:47:08.889059 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:08 crc kubenswrapper[5118]: E0223 06:47:08.889106 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" Feb 23 06:47:17 crc kubenswrapper[5118]: I0223 06:47:17.506943 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:47:17 crc kubenswrapper[5118]: I0223 06:47:17.517534 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 06:47:18 crc kubenswrapper[5118]: I0223 06:47:18.362232 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:47:18 crc kubenswrapper[5118]: E0223 06:47:18.878052 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:18 crc kubenswrapper[5118]: E0223 06:47:18.880256 5118 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:18 crc kubenswrapper[5118]: E0223 06:47:18.881747 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 06:47:18 crc kubenswrapper[5118]: E0223 06:47:18.881783 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" Feb 23 06:47:22 crc kubenswrapper[5118]: E0223 06:47:22.155071 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:47:22 crc kubenswrapper[5118]: E0223 06:47:22.156079 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mrhc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jfmkj_openshift-marketplace(4051ea46-bd23-4bc5-ae80-9b3cba5aa41f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:22 crc kubenswrapper[5118]: E0223 06:47:22.157388 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jfmkj" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" Feb 23 06:47:24 crc 
kubenswrapper[5118]: I0223 06:47:24.916578 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zksp_8ba477b1-5a26-44b2-800e-2eea288fef93/kube-multus-additional-cni-plugins/0.log" Feb 23 06:47:24 crc kubenswrapper[5118]: I0223 06:47:24.917170 5118 generic.go:334] "Generic (PLEG): container finished" podID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" exitCode=137 Feb 23 06:47:24 crc kubenswrapper[5118]: I0223 06:47:24.917234 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" event={"ID":"8ba477b1-5a26-44b2-800e-2eea288fef93","Type":"ContainerDied","Data":"bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046"} Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.242550 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jfmkj" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.358550 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zksp_8ba477b1-5a26-44b2-800e-2eea288fef93/kube-multus-additional-cni-plugins/0.log" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.358634 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.407059 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.407697 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7jcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jvh5v_openshift-marketplace(d061beb5-e2d8-49b3-b802-099dbdddafca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.409404 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jvh5v" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.414065 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.414238 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twbjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-qknnk_openshift-marketplace(e669f113-9929-4497-ba2b-e5cc2169ae24): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.418337 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qknnk" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" Feb 23 06:47:26 crc 
kubenswrapper[5118]: E0223 06:47:26.423646 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.424296 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7rvsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-526dd_openshift-marketplace(67106794-6724-41de-9db8-d51c468e1e28): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.427262 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-526dd" podUID="67106794-6724-41de-9db8-d51c468e1e28" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.439273 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.439469 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xltk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-l4d7l_openshift-marketplace(00c172a6-346f-41df-8fb1-3f6c9458da33): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.441869 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-l4d7l" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" Feb 23 06:47:26 crc 
kubenswrapper[5118]: E0223 06:47:26.443312 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.443512 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bl8jd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-brcz5_openshift-marketplace(01ab420b-d9af-4caf-b63d-c39f2d30e6d5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.444823 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-brcz5" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.565649 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready\") pod \"8ba477b1-5a26-44b2-800e-2eea288fef93\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.565683 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir\") pod \"8ba477b1-5a26-44b2-800e-2eea288fef93\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.565755 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist\") pod \"8ba477b1-5a26-44b2-800e-2eea288fef93\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.565796 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnjxj\" (UniqueName: \"kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj\") pod 
\"8ba477b1-5a26-44b2-800e-2eea288fef93\" (UID: \"8ba477b1-5a26-44b2-800e-2eea288fef93\") " Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.568342 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "8ba477b1-5a26-44b2-800e-2eea288fef93" (UID: "8ba477b1-5a26-44b2-800e-2eea288fef93"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.568676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready" (OuterVolumeSpecName: "ready") pod "8ba477b1-5a26-44b2-800e-2eea288fef93" (UID: "8ba477b1-5a26-44b2-800e-2eea288fef93"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.569191 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "8ba477b1-5a26-44b2-800e-2eea288fef93" (UID: "8ba477b1-5a26-44b2-800e-2eea288fef93"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.573402 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj" (OuterVolumeSpecName: "kube-api-access-mnjxj") pod "8ba477b1-5a26-44b2-800e-2eea288fef93" (UID: "8ba477b1-5a26-44b2-800e-2eea288fef93"). InnerVolumeSpecName "kube-api-access-mnjxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.666948 5118 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/8ba477b1-5a26-44b2-800e-2eea288fef93-ready\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.667439 5118 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ba477b1-5a26-44b2-800e-2eea288fef93-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.667458 5118 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ba477b1-5a26-44b2-800e-2eea288fef93-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.667473 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnjxj\" (UniqueName: \"kubernetes.io/projected/8ba477b1-5a26-44b2-800e-2eea288fef93-kube-api-access-mnjxj\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.937371 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7zksp_8ba477b1-5a26-44b2-800e-2eea288fef93/kube-multus-additional-cni-plugins/0.log" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.937723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" event={"ID":"8ba477b1-5a26-44b2-800e-2eea288fef93","Type":"ContainerDied","Data":"5db6293c7f82c3262fac1b93413aa998d4f055b96c6d19516c9f5655c6af418a"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.937821 5118 scope.go:117] "RemoveContainer" containerID="bb49768d718db0ec3884d0c7c5711e09a14c656ab7f5d3f3e540077a192e1046" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.938077 5118 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7zksp" Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.943846 5118 generic.go:334] "Generic (PLEG): container finished" podID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerID="c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369" exitCode=0 Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.944856 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerDied","Data":"c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.947524 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7aa424d8d8a849db98610848ab9d63549517eb87b292a70c1da376462d96ba10"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.950593 5118 generic.go:334] "Generic (PLEG): container finished" podID="b35a400e-565d-40e1-aa28-4896f541c19f" containerID="7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64" exitCode=0 Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.951157 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerDied","Data":"7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.954588 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e47b7e7e703db9c8612d523bfb7049c7ebea49c8accb83d6f7d6e9d7f7c8219b"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.954667 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"df2dda47951b03df9da5325078c6cba375cc36285701358618a39612d6ec1e9c"} Feb 23 06:47:26 crc kubenswrapper[5118]: I0223 06:47:26.956057 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"953b61c98b462cac31f8038acce09bc535aa4c48c8e05a43615e5e4afc4a5a81"} Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.957046 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qknnk" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.957187 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-brcz5" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.958765 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-526dd" podUID="67106794-6724-41de-9db8-d51c468e1e28" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.959232 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off 
pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jvh5v" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" Feb 23 06:47:26 crc kubenswrapper[5118]: E0223 06:47:26.960478 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-l4d7l" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.124192 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zksp"] Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.126636 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7zksp"] Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.705565 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" path="/var/lib/kubelet/pods/8ba477b1-5a26-44b2-800e-2eea288fef93/volumes" Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.962930 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerStarted","Data":"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"} Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.965823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"da320bb0e45c09894771294b1cc7b05e7afc2f4db401b3fcfa5900c497539cae"} Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.969758 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerStarted","Data":"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"} Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.971562 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b8a476be91112f206604341659820d90a099d4f6384eba4218791adc47c5c095"} Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.971718 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:47:27 crc kubenswrapper[5118]: I0223 06:47:27.980571 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ldm4m" podStartSLOduration=5.535232181 podStartE2EDuration="36.980552679s" podCreationTimestamp="2026-02-23 06:46:51 +0000 UTC" firstStartedPulling="2026-02-23 06:46:55.871006455 +0000 UTC m=+78.874791018" lastFinishedPulling="2026-02-23 06:47:27.316326903 +0000 UTC m=+110.320111516" observedRunningTime="2026-02-23 06:47:27.97970337 +0000 UTC m=+110.983487953" watchObservedRunningTime="2026-02-23 06:47:27.980552679 +0000 UTC m=+110.984337252" Feb 23 06:47:28 crc kubenswrapper[5118]: I0223 06:47:28.044717 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-89mtl" podStartSLOduration=4.71093876 podStartE2EDuration="35.044697928s" podCreationTimestamp="2026-02-23 06:46:53 +0000 UTC" firstStartedPulling="2026-02-23 06:46:57.030954273 +0000 UTC m=+80.034738846" lastFinishedPulling="2026-02-23 06:47:27.364713441 +0000 UTC m=+110.368498014" observedRunningTime="2026-02-23 06:47:28.042033119 +0000 UTC m=+111.045817692" watchObservedRunningTime="2026-02-23 06:47:28.044697928 +0000 UTC m=+111.048482501" Feb 23 
06:47:28 crc kubenswrapper[5118]: I0223 06:47:28.794838 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gc8gv" Feb 23 06:47:32 crc kubenswrapper[5118]: I0223 06:47:32.717169 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:47:32 crc kubenswrapper[5118]: I0223 06:47:32.717662 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:47:32 crc kubenswrapper[5118]: I0223 06:47:32.895726 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.053859 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.695066 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:47:33 crc kubenswrapper[5118]: E0223 06:47:33.702757 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9250ab61-5840-443a-b138-0f6652080ea6" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.702783 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9250ab61-5840-443a-b138-0f6652080ea6" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: E0223 06:47:33.702841 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c37c475-22ac-48ab-b699-0e01e84f1536" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.702849 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c37c475-22ac-48ab-b699-0e01e84f1536" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: E0223 06:47:33.702858 5118 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.702867 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.703020 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c37c475-22ac-48ab-b699-0e01e84f1536" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.703035 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9250ab61-5840-443a-b138-0f6652080ea6" containerName="pruner" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.703050 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba477b1-5a26-44b2-800e-2eea288fef93" containerName="kube-multus-additional-cni-plugins" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.703619 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.703791 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.706520 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.706844 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.883758 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.884188 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.985644 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.985754 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:33 crc kubenswrapper[5118]: I0223 06:47:33.985885 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.010724 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.031391 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.231499 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.716290 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.716488 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:47:34 crc kubenswrapper[5118]: I0223 06:47:34.775346 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:47:35 crc kubenswrapper[5118]: I0223 06:47:35.023230 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"668be977-eaf5-4975-b77b-9a6f1cc232a3","Type":"ContainerStarted","Data":"20b08f63757b5a40cd4e910af3aa00f673f7927cd78caedf8e3da54466d3e7e5"} Feb 23 06:47:35 crc kubenswrapper[5118]: I0223 06:47:35.023768 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"668be977-eaf5-4975-b77b-9a6f1cc232a3","Type":"ContainerStarted","Data":"3cb7cd22d1efe3371f793f09207f098ea428e8236f7d9f0657e54717f25d5e40"} Feb 23 06:47:35 crc kubenswrapper[5118]: I0223 06:47:35.047639 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.047621414 podStartE2EDuration="2.047621414s" podCreationTimestamp="2026-02-23 06:47:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:35.045059128 +0000 UTC m=+118.048843701" watchObservedRunningTime="2026-02-23 06:47:35.047621414 +0000 UTC m=+118.051405987" Feb 23 06:47:35 crc kubenswrapper[5118]: I0223 06:47:35.074421 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:47:36 crc kubenswrapper[5118]: I0223 06:47:36.029600 5118 generic.go:334] "Generic (PLEG): container finished" podID="668be977-eaf5-4975-b77b-9a6f1cc232a3" containerID="20b08f63757b5a40cd4e910af3aa00f673f7927cd78caedf8e3da54466d3e7e5" exitCode=0 Feb 23 06:47:36 crc kubenswrapper[5118]: I0223 06:47:36.030559 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"668be977-eaf5-4975-b77b-9a6f1cc232a3","Type":"ContainerDied","Data":"20b08f63757b5a40cd4e910af3aa00f673f7927cd78caedf8e3da54466d3e7e5"} Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.249806 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.437370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access\") pod \"668be977-eaf5-4975-b77b-9a6f1cc232a3\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.437586 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir\") pod \"668be977-eaf5-4975-b77b-9a6f1cc232a3\" (UID: \"668be977-eaf5-4975-b77b-9a6f1cc232a3\") " Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.437787 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "668be977-eaf5-4975-b77b-9a6f1cc232a3" (UID: "668be977-eaf5-4975-b77b-9a6f1cc232a3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.438092 5118 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/668be977-eaf5-4975-b77b-9a6f1cc232a3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.444831 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "668be977-eaf5-4975-b77b-9a6f1cc232a3" (UID: "668be977-eaf5-4975-b77b-9a6f1cc232a3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:37 crc kubenswrapper[5118]: I0223 06:47:37.539281 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/668be977-eaf5-4975-b77b-9a6f1cc232a3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:38 crc kubenswrapper[5118]: I0223 06:47:38.041893 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"668be977-eaf5-4975-b77b-9a6f1cc232a3","Type":"ContainerDied","Data":"3cb7cd22d1efe3371f793f09207f098ea428e8236f7d9f0657e54717f25d5e40"} Feb 23 06:47:38 crc kubenswrapper[5118]: I0223 06:47:38.042286 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb7cd22d1efe3371f793f09207f098ea428e8236f7d9f0657e54717f25d5e40" Feb 23 06:47:38 crc kubenswrapper[5118]: I0223 06:47:38.041972 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 06:47:39 crc kubenswrapper[5118]: I0223 06:47:39.052818 5118 generic.go:334] "Generic (PLEG): container finished" podID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerID="0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da" exitCode=0 Feb 23 06:47:39 crc kubenswrapper[5118]: I0223 06:47:39.052921 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerDied","Data":"0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da"} Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.062065 5118 generic.go:334] "Generic (PLEG): container finished" podID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerID="81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7" exitCode=0 Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.062225 5118 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerDied","Data":"81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7"} Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.070493 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerStarted","Data":"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"} Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.734132 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jfmkj" podStartSLOduration=5.856230212 podStartE2EDuration="49.734086455s" podCreationTimestamp="2026-02-23 06:46:51 +0000 UTC" firstStartedPulling="2026-02-23 06:46:55.606666116 +0000 UTC m=+78.610450689" lastFinishedPulling="2026-02-23 06:47:39.484522349 +0000 UTC m=+122.488306932" observedRunningTime="2026-02-23 06:47:40.114559604 +0000 UTC m=+123.118344177" watchObservedRunningTime="2026-02-23 06:47:40.734086455 +0000 UTC m=+123.737871028" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.891423 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:47:40 crc kubenswrapper[5118]: E0223 06:47:40.892264 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668be977-eaf5-4975-b77b-9a6f1cc232a3" containerName="pruner" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.892287 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="668be977-eaf5-4975-b77b-9a6f1cc232a3" containerName="pruner" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.892419 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="668be977-eaf5-4975-b77b-9a6f1cc232a3" containerName="pruner" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.892978 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.897482 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.897766 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.912026 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.995792 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.996091 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:40 crc kubenswrapper[5118]: I0223 06:47:40.996176 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.097698 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.097839 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.098519 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.098742 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.098783 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.120926 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.214893 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:47:41 crc kubenswrapper[5118]: I0223 06:47:41.943573 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 06:47:42 crc kubenswrapper[5118]: I0223 06:47:42.091503 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b844ac65-c3d9-4e59-82cd-71d83bc3198a","Type":"ContainerStarted","Data":"b67696d47c8e503c246e279f14c611ab3d5339c993cf2bb3bbc927544477d176"} Feb 23 06:47:42 crc kubenswrapper[5118]: I0223 06:47:42.515543 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:47:42 crc kubenswrapper[5118]: I0223 06:47:42.516023 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:47:42 crc kubenswrapper[5118]: I0223 06:47:42.567085 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.102304 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerStarted","Data":"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.104172 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerStarted","Data":"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.107477 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerStarted","Data":"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.109031 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerStarted","Data":"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.110846 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerStarted","Data":"890ba51492bd6944ebc49c33e8cd193120eab6cc28f676abf35608e0ae2e462c"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.113020 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b844ac65-c3d9-4e59-82cd-71d83bc3198a","Type":"ContainerStarted","Data":"ab071a05c1d2ce5891440a4395a79e44dd77e844a504db6fe09dca6278279560"} Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.243831 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4d7l" podStartSLOduration=5.089310063 podStartE2EDuration="51.24380525s" podCreationTimestamp="2026-02-23 06:46:52 +0000 UTC" firstStartedPulling="2026-02-23 06:46:55.545866442 +0000 UTC m=+78.549651015" lastFinishedPulling="2026-02-23 06:47:41.700361599 +0000 UTC m=+124.704146202" observedRunningTime="2026-02-23 06:47:43.222413624 +0000 UTC m=+126.226198227" watchObservedRunningTime="2026-02-23 06:47:43.24380525 +0000 UTC m=+126.247589823" Feb 23 06:47:43 crc kubenswrapper[5118]: I0223 06:47:43.245659 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.245654322 podStartE2EDuration="3.245654322s" podCreationTimestamp="2026-02-23 06:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:43.242953122 +0000 UTC m=+126.246737705" watchObservedRunningTime="2026-02-23 06:47:43.245654322 +0000 UTC m=+126.249438885" Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.121932 5118 generic.go:334] "Generic (PLEG): container finished" podID="67106794-6724-41de-9db8-d51c468e1e28" containerID="d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665" exitCode=0 Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.122006 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerDied","Data":"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"} Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.125272 5118 generic.go:334] "Generic (PLEG): container finished" podID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerID="3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c" exitCode=0 Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.125328 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerDied","Data":"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c"} Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.131165 5118 generic.go:334] "Generic (PLEG): container finished" podID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerID="231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a" exitCode=0 Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.131210 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerDied","Data":"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a"} Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.134463 5118 generic.go:334] "Generic (PLEG): container finished" podID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerID="890ba51492bd6944ebc49c33e8cd193120eab6cc28f676abf35608e0ae2e462c" exitCode=0 Feb 23 06:47:44 crc kubenswrapper[5118]: I0223 06:47:44.134543 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerDied","Data":"890ba51492bd6944ebc49c33e8cd193120eab6cc28f676abf35608e0ae2e462c"} Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.143953 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerStarted","Data":"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"} Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.147593 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerStarted","Data":"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f"} Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.153630 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerStarted","Data":"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf"} Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.157244 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" 
event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerStarted","Data":"14af38c51217358d9903bd127465a04b78bec99315d19aff82a7356bcaa0b7e4"} Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.174237 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-526dd" podStartSLOduration=3.3580670599999998 podStartE2EDuration="51.174217342s" podCreationTimestamp="2026-02-23 06:46:54 +0000 UTC" firstStartedPulling="2026-02-23 06:46:56.950864299 +0000 UTC m=+79.954648872" lastFinishedPulling="2026-02-23 06:47:44.767014571 +0000 UTC m=+127.770799154" observedRunningTime="2026-02-23 06:47:45.173216339 +0000 UTC m=+128.177000922" watchObservedRunningTime="2026-02-23 06:47:45.174217342 +0000 UTC m=+128.178001915" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.203381 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qknnk" podStartSLOduration=2.393234654 podStartE2EDuration="50.202724407s" podCreationTimestamp="2026-02-23 06:46:55 +0000 UTC" firstStartedPulling="2026-02-23 06:46:56.99669583 +0000 UTC m=+80.000480403" lastFinishedPulling="2026-02-23 06:47:44.806185563 +0000 UTC m=+127.809970156" observedRunningTime="2026-02-23 06:47:45.199777131 +0000 UTC m=+128.203561724" watchObservedRunningTime="2026-02-23 06:47:45.202724407 +0000 UTC m=+128.206508980" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.219740 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-brcz5" podStartSLOduration=3.8790606629999997 podStartE2EDuration="53.219720805s" podCreationTimestamp="2026-02-23 06:46:52 +0000 UTC" firstStartedPulling="2026-02-23 06:46:55.55121476 +0000 UTC m=+78.554999333" lastFinishedPulling="2026-02-23 06:47:44.891874892 +0000 UTC m=+127.895659475" observedRunningTime="2026-02-23 06:47:45.215177514 +0000 UTC m=+128.218962087" 
watchObservedRunningTime="2026-02-23 06:47:45.219720805 +0000 UTC m=+128.223505378" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.237381 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jvh5v" podStartSLOduration=3.520847136 podStartE2EDuration="51.237355878s" podCreationTimestamp="2026-02-23 06:46:54 +0000 UTC" firstStartedPulling="2026-02-23 06:46:56.993017458 +0000 UTC m=+79.996802031" lastFinishedPulling="2026-02-23 06:47:44.70952616 +0000 UTC m=+127.713310773" observedRunningTime="2026-02-23 06:47:45.232795037 +0000 UTC m=+128.236579610" watchObservedRunningTime="2026-02-23 06:47:45.237355878 +0000 UTC m=+128.241140461" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.358162 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-526dd" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.358225 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-526dd" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.623661 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:45 crc kubenswrapper[5118]: I0223 06:47:45.624026 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:46 crc kubenswrapper[5118]: I0223 06:47:46.396267 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-526dd" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="registry-server" probeResult="failure" output=< Feb 23 06:47:46 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 06:47:46 crc kubenswrapper[5118]: > Feb 23 06:47:46 crc kubenswrapper[5118]: I0223 06:47:46.669127 5118 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-qknnk" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="registry-server" probeResult="failure" output=< Feb 23 06:47:46 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 06:47:46 crc kubenswrapper[5118]: > Feb 23 06:47:52 crc kubenswrapper[5118]: I0223 06:47:52.584007 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:52 crc kubenswrapper[5118]: I0223 06:47:52.584846 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:52 crc kubenswrapper[5118]: I0223 06:47:52.603561 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:47:52 crc kubenswrapper[5118]: I0223 06:47:52.702017 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:53 crc kubenswrapper[5118]: I0223 06:47:53.046325 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:53 crc kubenswrapper[5118]: I0223 06:47:53.046427 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:53 crc kubenswrapper[5118]: I0223 06:47:53.130454 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:53 crc kubenswrapper[5118]: I0223 06:47:53.309587 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:53 crc kubenswrapper[5118]: I0223 06:47:53.312913 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.097194 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.097616 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.158995 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.329016 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.377441 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"] Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.377702 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" podUID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" containerName="controller-manager" containerID="cri-o://e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9" gracePeriod=30 Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.437873 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-526dd" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.478870 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"] Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.479293 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" podUID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" containerName="route-controller-manager" containerID="cri-o://67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902" gracePeriod=30 Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.498923 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4d7l"] Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.499546 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4d7l" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="registry-server" containerID="cri-o://6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd" gracePeriod=2 Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.506245 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-526dd" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.672462 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.690835 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.691384 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-brcz5" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="registry-server" containerID="cri-o://83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf" gracePeriod=2 Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.726218 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.766326 5118 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.849692 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config\") pod \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.849792 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca\") pod \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.849888 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert\") pod \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.849925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-689gj\" (UniqueName: \"kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj\") pod \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.849956 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles\") pod \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\" (UID: \"1ce759e4-1e35-4f6f-866d-e810a5e231c8\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.851037 5118 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config" (OuterVolumeSpecName: "config") pod "1ce759e4-1e35-4f6f-866d-e810a5e231c8" (UID: "1ce759e4-1e35-4f6f-866d-e810a5e231c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.851124 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ce759e4-1e35-4f6f-866d-e810a5e231c8" (UID: "1ce759e4-1e35-4f6f-866d-e810a5e231c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.851277 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1ce759e4-1e35-4f6f-866d-e810a5e231c8" (UID: "1ce759e4-1e35-4f6f-866d-e810a5e231c8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.855834 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.860602 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ce759e4-1e35-4f6f-866d-e810a5e231c8" (UID: "1ce759e4-1e35-4f6f-866d-e810a5e231c8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.861059 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj" (OuterVolumeSpecName: "kube-api-access-689gj") pod "1ce759e4-1e35-4f6f-866d-e810a5e231c8" (UID: "1ce759e4-1e35-4f6f-866d-e810a5e231c8"). InnerVolumeSpecName "kube-api-access-689gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.907386 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951386 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqb5c\" (UniqueName: \"kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c\") pod \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951489 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca\") pod \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951552 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert\") pod \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951582 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config\") pod \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\" (UID: \"07c58462-b8e6-404f-baa0-8eb3fec9bc6c\") " Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951817 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce759e4-1e35-4f6f-866d-e810a5e231c8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951837 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-689gj\" (UniqueName: \"kubernetes.io/projected/1ce759e4-1e35-4f6f-866d-e810a5e231c8-kube-api-access-689gj\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951850 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951863 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.951877 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce759e4-1e35-4f6f-866d-e810a5e231c8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.952744 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca" (OuterVolumeSpecName: "client-ca") pod "07c58462-b8e6-404f-baa0-8eb3fec9bc6c" (UID: "07c58462-b8e6-404f-baa0-8eb3fec9bc6c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.952762 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config" (OuterVolumeSpecName: "config") pod "07c58462-b8e6-404f-baa0-8eb3fec9bc6c" (UID: "07c58462-b8e6-404f-baa0-8eb3fec9bc6c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.955043 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07c58462-b8e6-404f-baa0-8eb3fec9bc6c" (UID: "07c58462-b8e6-404f-baa0-8eb3fec9bc6c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:47:55 crc kubenswrapper[5118]: I0223 06:47:55.957866 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c" (OuterVolumeSpecName: "kube-api-access-jqb5c") pod "07c58462-b8e6-404f-baa0-8eb3fec9bc6c" (UID: "07c58462-b8e6-404f-baa0-8eb3fec9bc6c"). InnerVolumeSpecName "kube-api-access-jqb5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.038685 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.052739 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xltk7\" (UniqueName: \"kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7\") pod \"00c172a6-346f-41df-8fb1-3f6c9458da33\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.052845 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content\") pod \"00c172a6-346f-41df-8fb1-3f6c9458da33\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.052939 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities\") pod \"00c172a6-346f-41df-8fb1-3f6c9458da33\" (UID: \"00c172a6-346f-41df-8fb1-3f6c9458da33\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.053199 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.053216 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.053225 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.053234 5118 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqb5c\" (UniqueName: \"kubernetes.io/projected/07c58462-b8e6-404f-baa0-8eb3fec9bc6c-kube-api-access-jqb5c\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.053947 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities" (OuterVolumeSpecName: "utilities") pod "00c172a6-346f-41df-8fb1-3f6c9458da33" (UID: "00c172a6-346f-41df-8fb1-3f6c9458da33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.056656 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7" (OuterVolumeSpecName: "kube-api-access-xltk7") pod "00c172a6-346f-41df-8fb1-3f6c9458da33" (UID: "00c172a6-346f-41df-8fb1-3f6c9458da33"). InnerVolumeSpecName "kube-api-access-xltk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.120373 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00c172a6-346f-41df-8fb1-3f6c9458da33" (UID: "00c172a6-346f-41df-8fb1-3f6c9458da33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154237 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities\") pod \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154325 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl8jd\" (UniqueName: \"kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd\") pod \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154466 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content\") pod \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\" (UID: \"01ab420b-d9af-4caf-b63d-c39f2d30e6d5\") " Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154766 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xltk7\" (UniqueName: \"kubernetes.io/projected/00c172a6-346f-41df-8fb1-3f6c9458da33-kube-api-access-xltk7\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154791 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.154803 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00c172a6-346f-41df-8fb1-3f6c9458da33-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.155910 
5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities" (OuterVolumeSpecName: "utilities") pod "01ab420b-d9af-4caf-b63d-c39f2d30e6d5" (UID: "01ab420b-d9af-4caf-b63d-c39f2d30e6d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.159318 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd" (OuterVolumeSpecName: "kube-api-access-bl8jd") pod "01ab420b-d9af-4caf-b63d-c39f2d30e6d5" (UID: "01ab420b-d9af-4caf-b63d-c39f2d30e6d5"). InnerVolumeSpecName "kube-api-access-bl8jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.234613 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01ab420b-d9af-4caf-b63d-c39f2d30e6d5" (UID: "01ab420b-d9af-4caf-b63d-c39f2d30e6d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.256271 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.256316 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.256335 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl8jd\" (UniqueName: \"kubernetes.io/projected/01ab420b-d9af-4caf-b63d-c39f2d30e6d5-kube-api-access-bl8jd\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.266730 5118 generic.go:334] "Generic (PLEG): container finished" podID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" containerID="67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902" exitCode=0 Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.266836 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.266887 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" event={"ID":"07c58462-b8e6-404f-baa0-8eb3fec9bc6c","Type":"ContainerDied","Data":"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.266943 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj" event={"ID":"07c58462-b8e6-404f-baa0-8eb3fec9bc6c","Type":"ContainerDied","Data":"d95a6fd1d8ded7733cf24d487eb5f6893c946a6c3c08f73d5950036d7edce700"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.266976 5118 scope.go:117] "RemoveContainer" containerID="67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.272438 5118 generic.go:334] "Generic (PLEG): container finished" podID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerID="6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd" exitCode=0 Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.272532 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerDied","Data":"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.272576 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4d7l" event={"ID":"00c172a6-346f-41df-8fb1-3f6c9458da33","Type":"ContainerDied","Data":"fc58a9f26a578ef68b6b9188823b5427feeb3fb52895603791fb175314df98ed"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.273474 5118 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-l4d7l" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.279703 5118 generic.go:334] "Generic (PLEG): container finished" podID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerID="83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf" exitCode=0 Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.279819 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerDied","Data":"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.279866 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-brcz5" event={"ID":"01ab420b-d9af-4caf-b63d-c39f2d30e6d5","Type":"ContainerDied","Data":"ad6ad238884b7933edc2da484f21c8f29e105ff308bd424ee86eee6d0b42be39"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.280017 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-brcz5" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.281674 5118 generic.go:334] "Generic (PLEG): container finished" podID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" containerID="e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9" exitCode=0 Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.282584 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.282703 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" event={"ID":"1ce759e4-1e35-4f6f-866d-e810a5e231c8","Type":"ContainerDied","Data":"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.282747 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b586ccb9-fnncr" event={"ID":"1ce759e4-1e35-4f6f-866d-e810a5e231c8","Type":"ContainerDied","Data":"525ecb7103bf2c8651777be1f7e52815de92714c1c2777289d2bd08951980337"} Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.307814 5118 scope.go:117] "RemoveContainer" containerID="67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.309739 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902\": container with ID starting with 67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902 not found: ID does not exist" containerID="67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.309793 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902"} err="failed to get container status \"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902\": rpc error: code = NotFound desc = could not find container \"67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902\": container with ID starting with 67238983cba54074f281696106d264639e87c8a6c7b39b93565c9082a6d30902 not found: ID does 
not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.309835 5118 scope.go:117] "RemoveContainer" containerID="6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.340771 5118 scope.go:117] "RemoveContainer" containerID="81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.342178 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.367756 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd46479d-4f8fj"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.384000 5118 scope.go:117] "RemoveContainer" containerID="6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.390489 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.399745 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-brcz5"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.404607 5118 scope.go:117] "RemoveContainer" containerID="6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.404977 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4d7l"] Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.405142 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd\": container with ID starting with 
6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd not found: ID does not exist" containerID="6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.405181 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd"} err="failed to get container status \"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd\": rpc error: code = NotFound desc = could not find container \"6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd\": container with ID starting with 6aeb2371cbfd62bafe7c91de6917933b13eab7acaf134037e2b8549f2ba16bbd not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.405206 5118 scope.go:117] "RemoveContainer" containerID="81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.405899 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7\": container with ID starting with 81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7 not found: ID does not exist" containerID="81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.406052 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7"} err="failed to get container status \"81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7\": rpc error: code = NotFound desc = could not find container \"81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7\": container with ID starting with 81e376ef3f3f6a5f982b8796afc174b35a55c0a338b12b97f07569b44c9e25d7 not found: ID does not 
exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.406239 5118 scope.go:117] "RemoveContainer" containerID="6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.407045 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b\": container with ID starting with 6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b not found: ID does not exist" containerID="6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.407172 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b"} err="failed to get container status \"6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b\": rpc error: code = NotFound desc = could not find container \"6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b\": container with ID starting with 6340a214964ef9dc544b23d844628302ba65c51494f6ea60ac309415a7951b6b not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.407229 5118 scope.go:117] "RemoveContainer" containerID="83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.411324 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4d7l"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.413861 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.419224 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56b586ccb9-fnncr"] Feb 23 06:47:56 crc 
kubenswrapper[5118]: I0223 06:47:56.425586 5118 scope.go:117] "RemoveContainer" containerID="231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.444598 5118 scope.go:117] "RemoveContainer" containerID="50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.461980 5118 scope.go:117] "RemoveContainer" containerID="83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.462480 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf\": container with ID starting with 83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf not found: ID does not exist" containerID="83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.462523 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf"} err="failed to get container status \"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf\": rpc error: code = NotFound desc = could not find container \"83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf\": container with ID starting with 83c40a63ba2d7306b9098fd921eca8252543394068f343d149f0328bf249d9bf not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.462558 5118 scope.go:117] "RemoveContainer" containerID="231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.463054 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a\": container with ID starting with 231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a not found: ID does not exist" containerID="231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.463492 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a"} err="failed to get container status \"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a\": rpc error: code = NotFound desc = could not find container \"231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a\": container with ID starting with 231601153b37d2546251665f499dd7223895c9e700ed36a1cb996ec7877e231a not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.463649 5118 scope.go:117] "RemoveContainer" containerID="50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.464273 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f\": container with ID starting with 50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f not found: ID does not exist" containerID="50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.464302 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f"} err="failed to get container status \"50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f\": rpc error: code = NotFound desc = could not find container \"50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f\": container with ID 
starting with 50aeda5f41cc7b51e6d421c17af5c04618c09a9b7da618931c395fffdf69b79f not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.464323 5118 scope.go:117] "RemoveContainer" containerID="e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.480051 5118 scope.go:117] "RemoveContainer" containerID="e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.480859 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9\": container with ID starting with e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9 not found: ID does not exist" containerID="e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.480901 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9"} err="failed to get container status \"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9\": rpc error: code = NotFound desc = could not find container \"e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9\": container with ID starting with e3fc769fe6d64379728af14b36339a749406f9f5fde0946d50d03b2acc86ebe9 not found: ID does not exist" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.704879 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705419 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705449 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705473 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="extract-utilities" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705494 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="extract-utilities" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705514 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705535 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705578 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="extract-content" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705596 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="extract-content" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705641 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="extract-content" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705660 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="extract-content" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705689 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" containerName="route-controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705708 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" containerName="route-controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705727 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="extract-utilities" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705743 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="extract-utilities" Feb 23 06:47:56 crc kubenswrapper[5118]: E0223 06:47:56.705770 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" containerName="controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.705787 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" containerName="controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.706013 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.706048 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" containerName="route-controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.706074 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" containerName="controller-manager" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.706135 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" containerName="registry-server" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.707089 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.711374 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.712681 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.726126 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.726255 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.726417 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.726528 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.726865 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.727345 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.727508 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.727846 5118 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.727995 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.728237 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.728808 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.733822 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.734649 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.734737 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.755311 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868005 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868462 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-kv2fg\" (UniqueName: \"kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868564 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcsqs\" (UniqueName: \"kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868639 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868721 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.868901 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " 
pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.869013 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.869120 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.869157 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.970951 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971074 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2fg\" 
(UniqueName: \"kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971144 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcsqs\" (UniqueName: \"kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971257 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971369 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc 
kubenswrapper[5118]: I0223 06:47:56.971419 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971464 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.971498 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.974848 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.975486 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " 
pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.975594 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.975599 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.977223 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.979375 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:56 crc kubenswrapper[5118]: I0223 06:47:56.983154 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert\") pod 
\"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.004349 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2fg\" (UniqueName: \"kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg\") pod \"controller-manager-54d88f76bf-p7b85\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.005271 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcsqs\" (UniqueName: \"kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs\") pod \"route-controller-manager-cd6b6d866-fns44\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.043543 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.069262 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.346489 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:47:57 crc kubenswrapper[5118]: W0223 06:47:57.352510 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef91c5e_da32_44d0_8c91_64ea51c4208a.slice/crio-2d06131615a3a8f0a4bdf0db6f4d2c572af3466c8c36f27d84027916305db377 WatchSource:0}: Error finding container 2d06131615a3a8f0a4bdf0db6f4d2c572af3466c8c36f27d84027916305db377: Status 404 returned error can't find the container with id 2d06131615a3a8f0a4bdf0db6f4d2c572af3466c8c36f27d84027916305db377 Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.403595 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:47:57 crc kubenswrapper[5118]: W0223 06:47:57.424181 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6667568_7ca5_4e89_935b_4de426ec1cf1.slice/crio-80a0167c7157e5877823de6ca4df3904ed413e282fa566afb2d026f0c18c7944 WatchSource:0}: Error finding container 80a0167c7157e5877823de6ca4df3904ed413e282fa566afb2d026f0c18c7944: Status 404 returned error can't find the container with id 80a0167c7157e5877823de6ca4df3904ed413e282fa566afb2d026f0c18c7944 Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.710988 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c172a6-346f-41df-8fb1-3f6c9458da33" path="/var/lib/kubelet/pods/00c172a6-346f-41df-8fb1-3f6c9458da33/volumes" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.712570 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="01ab420b-d9af-4caf-b63d-c39f2d30e6d5" path="/var/lib/kubelet/pods/01ab420b-d9af-4caf-b63d-c39f2d30e6d5/volumes" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.713201 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c58462-b8e6-404f-baa0-8eb3fec9bc6c" path="/var/lib/kubelet/pods/07c58462-b8e6-404f-baa0-8eb3fec9bc6c/volumes" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.714177 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce759e4-1e35-4f6f-866d-e810a5e231c8" path="/var/lib/kubelet/pods/1ce759e4-1e35-4f6f-866d-e810a5e231c8/volumes" Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.893698 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"] Feb 23 06:47:57 crc kubenswrapper[5118]: I0223 06:47:57.894077 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jvh5v" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="registry-server" containerID="cri-o://6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f" gracePeriod=2 Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.091153 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"] Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.091533 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qknnk" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="registry-server" containerID="cri-o://14af38c51217358d9903bd127465a04b78bec99315d19aff82a7356bcaa0b7e4" gracePeriod=2 Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.254194 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.307347 5118 generic.go:334] "Generic (PLEG): container finished" podID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerID="6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f" exitCode=0 Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.307428 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerDied","Data":"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.307485 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jvh5v" event={"ID":"d061beb5-e2d8-49b3-b802-099dbdddafca","Type":"ContainerDied","Data":"ec73d0468bbb1f12443a3b0193ab2ecd89761ad6c90611fa923afcd646ffed79"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.307508 5118 scope.go:117] "RemoveContainer" containerID="6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.307685 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jvh5v" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.319024 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" event={"ID":"d6667568-7ca5-4e89-935b-4de426ec1cf1","Type":"ContainerStarted","Data":"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.319154 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" event={"ID":"d6667568-7ca5-4e89-935b-4de426ec1cf1","Type":"ContainerStarted","Data":"80a0167c7157e5877823de6ca4df3904ed413e282fa566afb2d026f0c18c7944"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.319204 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.325090 5118 generic.go:334] "Generic (PLEG): container finished" podID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerID="14af38c51217358d9903bd127465a04b78bec99315d19aff82a7356bcaa0b7e4" exitCode=0 Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.325170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerDied","Data":"14af38c51217358d9903bd127465a04b78bec99315d19aff82a7356bcaa0b7e4"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.328469 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" event={"ID":"2ef91c5e-da32-44d0-8c91-64ea51c4208a","Type":"ContainerStarted","Data":"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.328531 5118 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" event={"ID":"2ef91c5e-da32-44d0-8c91-64ea51c4208a","Type":"ContainerStarted","Data":"2d06131615a3a8f0a4bdf0db6f4d2c572af3466c8c36f27d84027916305db377"} Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.329005 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.332059 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.334438 5118 scope.go:117] "RemoveContainer" containerID="3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.344378 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.384761 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" podStartSLOduration=3.384744682 podStartE2EDuration="3.384744682s" podCreationTimestamp="2026-02-23 06:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:58.358018922 +0000 UTC m=+141.361803505" watchObservedRunningTime="2026-02-23 06:47:58.384744682 +0000 UTC m=+141.388529255" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.394215 5118 scope.go:117] "RemoveContainer" containerID="8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.394865 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-c7jcb\" (UniqueName: \"kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb\") pod \"d061beb5-e2d8-49b3-b802-099dbdddafca\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.394966 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities\") pod \"d061beb5-e2d8-49b3-b802-099dbdddafca\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.395033 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content\") pod \"d061beb5-e2d8-49b3-b802-099dbdddafca\" (UID: \"d061beb5-e2d8-49b3-b802-099dbdddafca\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.396780 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities" (OuterVolumeSpecName: "utilities") pod "d061beb5-e2d8-49b3-b802-099dbdddafca" (UID: "d061beb5-e2d8-49b3-b802-099dbdddafca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.408550 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb" (OuterVolumeSpecName: "kube-api-access-c7jcb") pod "d061beb5-e2d8-49b3-b802-099dbdddafca" (UID: "d061beb5-e2d8-49b3-b802-099dbdddafca"). InnerVolumeSpecName "kube-api-access-c7jcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.425335 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" podStartSLOduration=3.425306463 podStartE2EDuration="3.425306463s" podCreationTimestamp="2026-02-23 06:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:47:58.417939286 +0000 UTC m=+141.421723869" watchObservedRunningTime="2026-02-23 06:47:58.425306463 +0000 UTC m=+141.429091036" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.428537 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d061beb5-e2d8-49b3-b802-099dbdddafca" (UID: "d061beb5-e2d8-49b3-b802-099dbdddafca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.444891 5118 scope.go:117] "RemoveContainer" containerID="6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f" Feb 23 06:47:58 crc kubenswrapper[5118]: E0223 06:47:58.445657 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f\": container with ID starting with 6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f not found: ID does not exist" containerID="6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.445747 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f"} err="failed to get container status \"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f\": rpc error: code = NotFound desc = could not find container \"6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f\": container with ID starting with 6a6b769e1cc6c841cc7c6d2d9139eda70d951420081f10e9d82541915eeb1b7f not found: ID does not exist" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.445790 5118 scope.go:117] "RemoveContainer" containerID="3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c" Feb 23 06:47:58 crc kubenswrapper[5118]: E0223 06:47:58.446426 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c\": container with ID starting with 3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c not found: ID does not exist" containerID="3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.446465 
5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c"} err="failed to get container status \"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c\": rpc error: code = NotFound desc = could not find container \"3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c\": container with ID starting with 3d02cffefea6872fbc3c7acd2d640697ddeddb719de0129bb65fd76e30f6f46c not found: ID does not exist" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.446486 5118 scope.go:117] "RemoveContainer" containerID="8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074" Feb 23 06:47:58 crc kubenswrapper[5118]: E0223 06:47:58.446857 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074\": container with ID starting with 8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074 not found: ID does not exist" containerID="8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.446902 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074"} err="failed to get container status \"8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074\": rpc error: code = NotFound desc = could not find container \"8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074\": container with ID starting with 8591ab199a32a99026c3632e1aab2f2fc204c5ed1498f3e57ed006f71e95e074 not found: ID does not exist" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.497181 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-utilities\") on node 
\"crc\" DevicePath \"\"" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.497218 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d061beb5-e2d8-49b3-b802-099dbdddafca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.497235 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jcb\" (UniqueName: \"kubernetes.io/projected/d061beb5-e2d8-49b3-b802-099dbdddafca-kube-api-access-c7jcb\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.523236 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.642787 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"] Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.648771 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jvh5v"] Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.700201 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities\") pod \"e669f113-9929-4497-ba2b-e5cc2169ae24\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.700313 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content\") pod \"e669f113-9929-4497-ba2b-e5cc2169ae24\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.700405 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twbjt\" 
(UniqueName: \"kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt\") pod \"e669f113-9929-4497-ba2b-e5cc2169ae24\" (UID: \"e669f113-9929-4497-ba2b-e5cc2169ae24\") " Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.701309 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities" (OuterVolumeSpecName: "utilities") pod "e669f113-9929-4497-ba2b-e5cc2169ae24" (UID: "e669f113-9929-4497-ba2b-e5cc2169ae24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.705404 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt" (OuterVolumeSpecName: "kube-api-access-twbjt") pod "e669f113-9929-4497-ba2b-e5cc2169ae24" (UID: "e669f113-9929-4497-ba2b-e5cc2169ae24"). InnerVolumeSpecName "kube-api-access-twbjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.792963 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6hnk"] Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.802475 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.802508 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twbjt\" (UniqueName: \"kubernetes.io/projected/e669f113-9929-4497-ba2b-e5cc2169ae24-kube-api-access-twbjt\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.867254 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e669f113-9929-4497-ba2b-e5cc2169ae24" (UID: "e669f113-9929-4497-ba2b-e5cc2169ae24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:47:58 crc kubenswrapper[5118]: I0223 06:47:58.903883 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e669f113-9929-4497-ba2b-e5cc2169ae24-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.338753 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qknnk" event={"ID":"e669f113-9929-4497-ba2b-e5cc2169ae24","Type":"ContainerDied","Data":"332be544671998abc5830870daadbe62b9a04b216facca13c128bc3a59a2d482"} Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.338815 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qknnk" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.338911 5118 scope.go:117] "RemoveContainer" containerID="14af38c51217358d9903bd127465a04b78bec99315d19aff82a7356bcaa0b7e4" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.363410 5118 scope.go:117] "RemoveContainer" containerID="890ba51492bd6944ebc49c33e8cd193120eab6cc28f676abf35608e0ae2e462c" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.389928 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"] Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.393250 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qknnk"] Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.415779 5118 scope.go:117] "RemoveContainer" containerID="80c68126beabffb30f3625d22dac1ad5512216113bf8b196394018a315a613c3" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.711597 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" path="/var/lib/kubelet/pods/d061beb5-e2d8-49b3-b802-099dbdddafca/volumes" Feb 23 06:47:59 crc kubenswrapper[5118]: I0223 06:47:59.712555 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" path="/var/lib/kubelet/pods/e669f113-9929-4497-ba2b-e5cc2169ae24/volumes" Feb 23 06:48:06 crc kubenswrapper[5118]: I0223 06:48:06.022196 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 06:48:15 crc kubenswrapper[5118]: I0223 06:48:15.403213 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:48:15 crc kubenswrapper[5118]: I0223 06:48:15.404291 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" podUID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" containerName="controller-manager" containerID="cri-o://456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d" gracePeriod=30 Feb 23 06:48:15 crc kubenswrapper[5118]: I0223 06:48:15.441732 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:48:15 crc kubenswrapper[5118]: I0223 06:48:15.442042 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" podUID="d6667568-7ca5-4e89-935b-4de426ec1cf1" containerName="route-controller-manager" containerID="cri-o://fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2" gracePeriod=30 Feb 23 06:48:15 crc kubenswrapper[5118]: I0223 06:48:15.969946 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.025869 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca\") pod \"d6667568-7ca5-4e89-935b-4de426ec1cf1\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.025951 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config\") pod \"d6667568-7ca5-4e89-935b-4de426ec1cf1\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.026073 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcsqs\" (UniqueName: 
\"kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs\") pod \"d6667568-7ca5-4e89-935b-4de426ec1cf1\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.026125 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert\") pod \"d6667568-7ca5-4e89-935b-4de426ec1cf1\" (UID: \"d6667568-7ca5-4e89-935b-4de426ec1cf1\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.027284 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6667568-7ca5-4e89-935b-4de426ec1cf1" (UID: "d6667568-7ca5-4e89-935b-4de426ec1cf1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.027447 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config" (OuterVolumeSpecName: "config") pod "d6667568-7ca5-4e89-935b-4de426ec1cf1" (UID: "d6667568-7ca5-4e89-935b-4de426ec1cf1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.033142 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs" (OuterVolumeSpecName: "kube-api-access-jcsqs") pod "d6667568-7ca5-4e89-935b-4de426ec1cf1" (UID: "d6667568-7ca5-4e89-935b-4de426ec1cf1"). InnerVolumeSpecName "kube-api-access-jcsqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.033602 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6667568-7ca5-4e89-935b-4de426ec1cf1" (UID: "d6667568-7ca5-4e89-935b-4de426ec1cf1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.046244 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.128580 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles\") pod \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.128828 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config\") pod \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.128940 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv2fg\" (UniqueName: \"kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg\") pod \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.129034 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert\") pod \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.129259 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca\") pod \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\" (UID: \"2ef91c5e-da32-44d0-8c91-64ea51c4208a\") " Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.129949 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca" (OuterVolumeSpecName: "client-ca") pod "2ef91c5e-da32-44d0-8c91-64ea51c4208a" (UID: "2ef91c5e-da32-44d0-8c91-64ea51c4208a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.129991 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.130033 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6667568-7ca5-4e89-935b-4de426ec1cf1-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.130090 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcsqs\" (UniqueName: \"kubernetes.io/projected/d6667568-7ca5-4e89-935b-4de426ec1cf1-kube-api-access-jcsqs\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.130134 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6667568-7ca5-4e89-935b-4de426ec1cf1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 
23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.129983 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config" (OuterVolumeSpecName: "config") pod "2ef91c5e-da32-44d0-8c91-64ea51c4208a" (UID: "2ef91c5e-da32-44d0-8c91-64ea51c4208a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.131118 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2ef91c5e-da32-44d0-8c91-64ea51c4208a" (UID: "2ef91c5e-da32-44d0-8c91-64ea51c4208a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.133412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg" (OuterVolumeSpecName: "kube-api-access-kv2fg") pod "2ef91c5e-da32-44d0-8c91-64ea51c4208a" (UID: "2ef91c5e-da32-44d0-8c91-64ea51c4208a"). InnerVolumeSpecName "kube-api-access-kv2fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.134233 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2ef91c5e-da32-44d0-8c91-64ea51c4208a" (UID: "2ef91c5e-da32-44d0-8c91-64ea51c4208a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.232202 5118 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.232268 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.232290 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv2fg\" (UniqueName: \"kubernetes.io/projected/2ef91c5e-da32-44d0-8c91-64ea51c4208a-kube-api-access-kv2fg\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.232315 5118 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ef91c5e-da32-44d0-8c91-64ea51c4208a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.232333 5118 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ef91c5e-da32-44d0-8c91-64ea51c4208a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.484931 5118 generic.go:334] "Generic (PLEG): container finished" podID="d6667568-7ca5-4e89-935b-4de426ec1cf1" containerID="fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2" exitCode=0 Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.485034 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.485056 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" event={"ID":"d6667568-7ca5-4e89-935b-4de426ec1cf1","Type":"ContainerDied","Data":"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2"} Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.485212 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44" event={"ID":"d6667568-7ca5-4e89-935b-4de426ec1cf1","Type":"ContainerDied","Data":"80a0167c7157e5877823de6ca4df3904ed413e282fa566afb2d026f0c18c7944"} Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.485252 5118 scope.go:117] "RemoveContainer" containerID="fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.490597 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.490685 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" event={"ID":"2ef91c5e-da32-44d0-8c91-64ea51c4208a","Type":"ContainerDied","Data":"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d"} Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.490499 5118 generic.go:334] "Generic (PLEG): container finished" podID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" containerID="456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d" exitCode=0 Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.491429 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d88f76bf-p7b85" event={"ID":"2ef91c5e-da32-44d0-8c91-64ea51c4208a","Type":"ContainerDied","Data":"2d06131615a3a8f0a4bdf0db6f4d2c572af3466c8c36f27d84027916305db377"} Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.536173 5118 scope.go:117] "RemoveContainer" containerID="fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.538296 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.540067 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2\": container with ID starting with fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2 not found: ID does not exist" containerID="fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.540155 5118 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2"} err="failed to get container status \"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2\": rpc error: code = NotFound desc = could not find container \"fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2\": container with ID starting with fcf0ea9b63c47acb01b6e202264b14fcf261689a35eeec38883e88a04032c4c2 not found: ID does not exist" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.540196 5118 scope.go:117] "RemoveContainer" containerID="456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.543742 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd6b6d866-fns44"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.557170 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.565343 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54d88f76bf-p7b85"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.573151 5118 scope.go:117] "RemoveContainer" containerID="456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.574068 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d\": container with ID starting with 456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d not found: ID does not exist" containerID="456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.574162 5118 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d"} err="failed to get container status \"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d\": rpc error: code = NotFound desc = could not find container \"456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d\": container with ID starting with 456056bcf5c6b6c1a470a0341634480688764bd9f8db4830a5a097105b67e67d not found: ID does not exist" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717331 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69d997f996-chn4s"] Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717732 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="extract-utilities" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717765 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="extract-utilities" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717790 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="extract-content" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717804 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="extract-content" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717834 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" containerName="controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717849 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" containerName="controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717872 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6667568-7ca5-4e89-935b-4de426ec1cf1" containerName="route-controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717886 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6667568-7ca5-4e89-935b-4de426ec1cf1" containerName="route-controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717906 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="extract-content" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717919 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="extract-content" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717937 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="extract-utilities" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717952 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="extract-utilities" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.717974 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.717988 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: E0223 06:48:16.718017 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.718030 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.718245 5118 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d6667568-7ca5-4e89-935b-4de426ec1cf1" containerName="route-controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.718267 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d061beb5-e2d8-49b3-b802-099dbdddafca" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.718280 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e669f113-9929-4497-ba2b-e5cc2169ae24" containerName="registry-server" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.718302 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" containerName="controller-manager" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.719183 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.720893 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.722660 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.724331 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.724398 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.724578 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.726267 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.726720 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.726928 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.727084 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.727498 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.727821 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.728816 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:48:16 crc 
kubenswrapper[5118]: I0223 06:48:16.729239 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.729590 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.735680 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69d997f996-chn4s"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.738843 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.740316 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv"] Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.841320 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-config\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.841752 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41afcfae-a0a3-4735-a5e2-06ba63da69d0-serving-cert\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842025 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3857d16e-ae9b-4a74-9014-90c8ba05e61d-serving-cert\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842325 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-client-ca\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842395 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-proxy-ca-bundles\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842562 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r67mm\" (UniqueName: \"kubernetes.io/projected/3857d16e-ae9b-4a74-9014-90c8ba05e61d-kube-api-access-r67mm\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842680 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-client-ca\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: 
\"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842766 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-config\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.842862 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvb8\" (UniqueName: \"kubernetes.io/projected/41afcfae-a0a3-4735-a5e2-06ba63da69d0-kube-api-access-7gvb8\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944512 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-config\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944585 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41afcfae-a0a3-4735-a5e2-06ba63da69d0-serving-cert\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3857d16e-ae9b-4a74-9014-90c8ba05e61d-serving-cert\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944787 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-client-ca\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944897 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-proxy-ca-bundles\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944960 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r67mm\" (UniqueName: \"kubernetes.io/projected/3857d16e-ae9b-4a74-9014-90c8ba05e61d-kube-api-access-r67mm\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.944997 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-client-ca\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " 
pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.945035 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-config\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.945071 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvb8\" (UniqueName: \"kubernetes.io/projected/41afcfae-a0a3-4735-a5e2-06ba63da69d0-kube-api-access-7gvb8\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.946539 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-client-ca\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.946721 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3857d16e-ae9b-4a74-9014-90c8ba05e61d-config\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.949805 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-config\") pod 
\"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.950199 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-proxy-ca-bundles\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.952152 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/41afcfae-a0a3-4735-a5e2-06ba63da69d0-client-ca\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.953400 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41afcfae-a0a3-4735-a5e2-06ba63da69d0-serving-cert\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.953514 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3857d16e-ae9b-4a74-9014-90c8ba05e61d-serving-cert\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.965928 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r67mm\" (UniqueName: 
\"kubernetes.io/projected/3857d16e-ae9b-4a74-9014-90c8ba05e61d-kube-api-access-r67mm\") pod \"route-controller-manager-85685c49dc-xs8cv\" (UID: \"3857d16e-ae9b-4a74-9014-90c8ba05e61d\") " pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:16 crc kubenswrapper[5118]: I0223 06:48:16.976458 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvb8\" (UniqueName: \"kubernetes.io/projected/41afcfae-a0a3-4735-a5e2-06ba63da69d0-kube-api-access-7gvb8\") pod \"controller-manager-69d997f996-chn4s\" (UID: \"41afcfae-a0a3-4735-a5e2-06ba63da69d0\") " pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.087412 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.103582 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.384596 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv"] Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.401791 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69d997f996-chn4s"] Feb 23 06:48:17 crc kubenswrapper[5118]: W0223 06:48:17.410365 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41afcfae_a0a3_4735_a5e2_06ba63da69d0.slice/crio-c80b37941a7573792bc4e15c2285539cd513447e8b885a5d43d7a46c8118d3ae WatchSource:0}: Error finding container c80b37941a7573792bc4e15c2285539cd513447e8b885a5d43d7a46c8118d3ae: Status 404 returned error can't find the container with id c80b37941a7573792bc4e15c2285539cd513447e8b885a5d43d7a46c8118d3ae Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.514023 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" event={"ID":"3857d16e-ae9b-4a74-9014-90c8ba05e61d","Type":"ContainerStarted","Data":"f7fb7edf8c17be45db1c70b5436a0a08f2fddb3aac7949452020287429cf8fe1"} Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.518460 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" event={"ID":"41afcfae-a0a3-4735-a5e2-06ba63da69d0","Type":"ContainerStarted","Data":"c80b37941a7573792bc4e15c2285539cd513447e8b885a5d43d7a46c8118d3ae"} Feb 23 06:48:17 crc kubenswrapper[5118]: I0223 06:48:17.707275 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef91c5e-da32-44d0-8c91-64ea51c4208a" path="/var/lib/kubelet/pods/2ef91c5e-da32-44d0-8c91-64ea51c4208a/volumes" Feb 23 06:48:17 
crc kubenswrapper[5118]: I0223 06:48:17.708849 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6667568-7ca5-4e89-935b-4de426ec1cf1" path="/var/lib/kubelet/pods/d6667568-7ca5-4e89-935b-4de426ec1cf1/volumes" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.532184 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" event={"ID":"41afcfae-a0a3-4735-a5e2-06ba63da69d0","Type":"ContainerStarted","Data":"0ad88aa19ef45baf79872a811dd8242f872369200fca9b1da0a924d069ef6f53"} Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.532819 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.535250 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" event={"ID":"3857d16e-ae9b-4a74-9014-90c8ba05e61d","Type":"ContainerStarted","Data":"62b2097371ab5daae904a0ca83529f565f653c0186e8dad46c01829cce767c50"} Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.535961 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.541183 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.547765 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.566754 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69d997f996-chn4s" 
podStartSLOduration=3.566720772 podStartE2EDuration="3.566720772s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:18.562949507 +0000 UTC m=+161.566734120" watchObservedRunningTime="2026-02-23 06:48:18.566720772 +0000 UTC m=+161.570505375" Feb 23 06:48:18 crc kubenswrapper[5118]: I0223 06:48:18.625733 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85685c49dc-xs8cv" podStartSLOduration=3.625698022 podStartE2EDuration="3.625698022s" podCreationTimestamp="2026-02-23 06:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:18.624090511 +0000 UTC m=+161.627875114" watchObservedRunningTime="2026-02-23 06:48:18.625698022 +0000 UTC m=+161.629482635" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.120364 5118 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.121212 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d" gracePeriod=15 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.121310 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e" gracePeriod=15 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.121396 5118 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb" gracePeriod=15 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.121371 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b" gracePeriod=15 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.121310 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61" gracePeriod=15 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128196 5118 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128581 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128604 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128620 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128636 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128663 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128682 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128706 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128719 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128738 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128751 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128777 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128788 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128804 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 
06:48:20.128816 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.128828 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.128839 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129017 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129039 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129056 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129074 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129102 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129224 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.129694 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.139555 5118 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.142447 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.151366 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.180115 5118 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299251 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299309 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299337 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299378 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299536 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299572 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.299601 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 
06:48:20.299769 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.401771 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402272 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402490 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402655 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402831 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402992 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.403200 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.403408 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.401933 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402888 5118 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402556 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.403049 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402341 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.403270 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.402731 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.403443 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.480997 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:20 crc kubenswrapper[5118]: W0223 06:48:20.516562 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4586d9f69d8cf42d130b526678ca7df6bdd6bd0c5974eb1aa1113923fc936a97 WatchSource:0}: Error finding container 4586d9f69d8cf42d130b526678ca7df6bdd6bd0c5974eb1aa1113923fc936a97: Status 404 returned error can't find the container with id 4586d9f69d8cf42d130b526678ca7df6bdd6bd0c5974eb1aa1113923fc936a97 Feb 23 06:48:20 crc kubenswrapper[5118]: E0223 06:48:20.520955 5118 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd5c7c1e964a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,LastTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.558709 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4586d9f69d8cf42d130b526678ca7df6bdd6bd0c5974eb1aa1113923fc936a97"} Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.563021 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.566206 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.567316 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.567360 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.567377 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.567394 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b" exitCode=2 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.567495 5118 scope.go:117] "RemoveContainer" containerID="38c2e880812327f88945cdcc0f5f26f9fb1fe825899287fe03200cba03496836" Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.571631 5118 generic.go:334] "Generic (PLEG): container finished" podID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" containerID="ab071a05c1d2ce5891440a4395a79e44dd77e844a504db6fe09dca6278279560" exitCode=0 Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.571675 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b844ac65-c3d9-4e59-82cd-71d83bc3198a","Type":"ContainerDied","Data":"ab071a05c1d2ce5891440a4395a79e44dd77e844a504db6fe09dca6278279560"} Feb 23 06:48:20 crc kubenswrapper[5118]: I0223 06:48:20.573862 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:21 crc kubenswrapper[5118]: I0223 06:48:21.588687 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:48:21 crc kubenswrapper[5118]: I0223 06:48:21.595575 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0"} Feb 23 06:48:21 crc kubenswrapper[5118]: E0223 06:48:21.596860 5118 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:21 crc kubenswrapper[5118]: I0223 06:48:21.596845 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:21 crc kubenswrapper[5118]: I0223 06:48:21.979322 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:48:21 crc kubenswrapper[5118]: I0223 06:48:21.980038 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.130726 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir\") pod \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.130792 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock\") pod \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.130835 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access\") pod \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\" (UID: \"b844ac65-c3d9-4e59-82cd-71d83bc3198a\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.130880 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b844ac65-c3d9-4e59-82cd-71d83bc3198a" (UID: "b844ac65-c3d9-4e59-82cd-71d83bc3198a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.130968 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock" (OuterVolumeSpecName: "var-lock") pod "b844ac65-c3d9-4e59-82cd-71d83bc3198a" (UID: "b844ac65-c3d9-4e59-82cd-71d83bc3198a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.131166 5118 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.131184 5118 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b844ac65-c3d9-4e59-82cd-71d83bc3198a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.143686 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b844ac65-c3d9-4e59-82cd-71d83bc3198a" (UID: "b844ac65-c3d9-4e59-82cd-71d83bc3198a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.235057 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b844ac65-c3d9-4e59-82cd-71d83bc3198a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.518540 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.519940 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.520994 5118 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.521271 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539549 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539592 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539646 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539677 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539710 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539811 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539875 5118 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539889 5118 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.539897 5118 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.604554 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b844ac65-c3d9-4e59-82cd-71d83bc3198a","Type":"ContainerDied","Data":"b67696d47c8e503c246e279f14c611ab3d5339c993cf2bb3bbc927544477d176"} Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.605769 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b67696d47c8e503c246e279f14c611ab3d5339c993cf2bb3bbc927544477d176" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.604581 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.607731 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.608797 5118 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d" exitCode=0 Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.608905 5118 scope.go:117] "RemoveContainer" containerID="66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.608953 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.609964 5118 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.625234 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.625652 5118 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.633729 5118 scope.go:117] "RemoveContainer" containerID="6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.634615 5118 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.635068 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.652430 5118 scope.go:117] "RemoveContainer" containerID="59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.670228 5118 scope.go:117] "RemoveContainer" containerID="17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.689977 5118 scope.go:117] "RemoveContainer" containerID="53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.716047 5118 scope.go:117] "RemoveContainer" containerID="438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.754565 5118 scope.go:117] "RemoveContainer" containerID="66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.755591 5118 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\": container with ID starting with 66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb not found: ID does not exist" containerID="66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.755751 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb"} err="failed to get container status \"66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\": rpc error: code = NotFound desc = could not find container \"66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb\": container with ID starting with 66154d02eef776ec5585445a58dc170e2b0bfc8e8e8de12dcec8b4ce6734d8bb not found: ID does not exist" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.755826 5118 scope.go:117] "RemoveContainer" containerID="6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.756379 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\": container with ID starting with 6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e not found: ID does not exist" containerID="6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.756453 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e"} err="failed to get container status \"6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\": rpc error: code = NotFound desc = could not find container 
\"6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e\": container with ID starting with 6315e54d8465e17a29337964947a415ae69876a333451140785d8d07ba55766e not found: ID does not exist" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.756482 5118 scope.go:117] "RemoveContainer" containerID="59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.757047 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\": container with ID starting with 59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61 not found: ID does not exist" containerID="59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.757326 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61"} err="failed to get container status \"59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\": rpc error: code = NotFound desc = could not find container \"59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61\": container with ID starting with 59715f785902ea54a69ea8b8c5c257ef2101559800a9079f94e0ae0a6ed59b61 not found: ID does not exist" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.757390 5118 scope.go:117] "RemoveContainer" containerID="17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.758376 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\": container with ID starting with 17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b not found: ID does not exist" 
containerID="17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.758518 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b"} err="failed to get container status \"17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\": rpc error: code = NotFound desc = could not find container \"17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b\": container with ID starting with 17cd8d1d7fa0457b4d2f16e822437ae496a6514703474257b75a64d2396f2a9b not found: ID does not exist" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.758633 5118 scope.go:117] "RemoveContainer" containerID="53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.759208 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\": container with ID starting with 53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d not found: ID does not exist" containerID="53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.759260 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d"} err="failed to get container status \"53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\": rpc error: code = NotFound desc = could not find container \"53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d\": container with ID starting with 53d8e4a55f2a47da2eeb56025c6b3863c0be2af5f18850510b5cd580726aaf5d not found: ID does not exist" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.759291 5118 scope.go:117] 
"RemoveContainer" containerID="438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f" Feb 23 06:48:22 crc kubenswrapper[5118]: E0223 06:48:22.759867 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\": container with ID starting with 438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f not found: ID does not exist" containerID="438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f" Feb 23 06:48:22 crc kubenswrapper[5118]: I0223 06:48:22.759973 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f"} err="failed to get container status \"438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\": rpc error: code = NotFound desc = could not find container \"438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f\": container with ID starting with 438c9b782e8a765056c5b8ccb1270c7b49bad391e16b3bea52c3ceaccee21a8f not found: ID does not exist" Feb 23 06:48:23 crc kubenswrapper[5118]: I0223 06:48:23.711873 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 23 06:48:23 crc kubenswrapper[5118]: I0223 06:48:23.846238 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" containerName="oauth-openshift" containerID="cri-o://30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e" gracePeriod=15 Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.342527 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.343629 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.344452 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475512 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475709 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475775 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session\") pod 
\"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475815 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475860 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.475919 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.476190 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.477476 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: 
"90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.477463 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478177 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478240 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478297 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478345 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: 
\"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478395 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478511 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjkxx\" (UniqueName: \"kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx\") pod \"90b8cdf6-2770-4311-87a9-55c70e7967cf\" (UID: \"90b8cdf6-2770-4311-87a9-55c70e7967cf\") " Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.479065 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.478480 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.479333 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.479431 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.480798 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.483912 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.484465 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.485053 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.485681 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.486279 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.486521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.486793 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.490448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx" (OuterVolumeSpecName: "kube-api-access-pjkxx") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "kube-api-access-pjkxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.490476 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "90b8cdf6-2770-4311-87a9-55c70e7967cf" (UID: "90b8cdf6-2770-4311-87a9-55c70e7967cf"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581075 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581169 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581193 5118 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581215 5118 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b8cdf6-2770-4311-87a9-55c70e7967cf-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581232 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581252 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjkxx\" (UniqueName: \"kubernetes.io/projected/90b8cdf6-2770-4311-87a9-55c70e7967cf-kube-api-access-pjkxx\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581271 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581292 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581314 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581334 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581352 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581370 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.581388 5118 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/90b8cdf6-2770-4311-87a9-55c70e7967cf-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 
06:48:24.626415 5118 generic.go:334] "Generic (PLEG): container finished" podID="90b8cdf6-2770-4311-87a9-55c70e7967cf" containerID="30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e" exitCode=0 Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.626503 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" event={"ID":"90b8cdf6-2770-4311-87a9-55c70e7967cf","Type":"ContainerDied","Data":"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e"} Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.626556 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.626597 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" event={"ID":"90b8cdf6-2770-4311-87a9-55c70e7967cf","Type":"ContainerDied","Data":"cb4381f00dd44a5cc9866074f305f826582f23254003faf098d5c97662d535b6"} Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.626672 5118 scope.go:117] "RemoveContainer" containerID="30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.627568 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.628377 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": 
dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.652317 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.652807 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.657017 5118 scope.go:117] "RemoveContainer" containerID="30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e" Feb 23 06:48:24 crc kubenswrapper[5118]: E0223 06:48:24.657741 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e\": container with ID starting with 30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e not found: ID does not exist" containerID="30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e" Feb 23 06:48:24 crc kubenswrapper[5118]: I0223 06:48:24.657804 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e"} err="failed to get container status \"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e\": rpc error: code = NotFound desc = could not find container \"30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e\": container with ID starting with 
30217c0d11c517da0c42d5db797692278a6b2401c27112319476c214144de85e not found: ID does not exist" Feb 23 06:48:26 crc kubenswrapper[5118]: E0223 06:48:26.182619 5118 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd5c7c1e964a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,LastTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:48:27 crc kubenswrapper[5118]: I0223 06:48:27.702170 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:27 crc kubenswrapper[5118]: I0223 06:48:27.703704 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.879085 5118 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.879906 5118 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.880544 5118 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.881027 5118 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.881500 5118 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:28 crc kubenswrapper[5118]: I0223 06:48:28.881557 5118 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 06:48:28 crc kubenswrapper[5118]: E0223 06:48:28.881991 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms" Feb 23 06:48:29 crc kubenswrapper[5118]: E0223 06:48:29.083042 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms" Feb 23 06:48:29 crc kubenswrapper[5118]: E0223 06:48:29.483854 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms" Feb 23 06:48:30 crc kubenswrapper[5118]: E0223 06:48:30.285243 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s" Feb 23 06:48:30 crc kubenswrapper[5118]: E0223 06:48:30.796763 5118 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" volumeName="registry-storage" Feb 23 06:48:31 crc kubenswrapper[5118]: E0223 06:48:31.886798 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: 
connect: connection refused" interval="3.2s" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.400427 5118 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.400819 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.697449 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.699141 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.700086 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.723076 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.723769 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:34 crc kubenswrapper[5118]: E0223 06:48:34.724403 5118 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.725379 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.734361 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.735422 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.735496 5118 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7" exitCode=1 Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.735542 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7"} Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.736300 5118 scope.go:117] "RemoveContainer" 
containerID="9ff2beab0cd75d0d4df4d459e85b7c74dbdbf8965df9638cc7f614fede66b4b7" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.736795 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.737299 5118 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: I0223 06:48:34.737811 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:34 crc kubenswrapper[5118]: W0223 06:48:34.770312 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-22ff4a7993029f87674c6dede26c09afd3c219fc39f91f7c78b60b7da0189169 WatchSource:0}: Error finding container 22ff4a7993029f87674c6dede26c09afd3c219fc39f91f7c78b60b7da0189169: Status 404 returned error can't find the container with id 22ff4a7993029f87674c6dede26c09afd3c219fc39f91f7c78b60b7da0189169 Feb 23 06:48:35 crc kubenswrapper[5118]: E0223 06:48:35.089023 5118 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="6.4s" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.749508 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.750566 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.750717 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d1183de7bb2fdf478bebd89ba037089bde2dcc59d34bd7ad420961749a71373b"} Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.752271 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.753694 5118 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.753855 5118 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerID="b08a7173a6391f5e1675b63238b8d7c84420f85d23a41c2a2acd37222326f17a" exitCode=0 Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.753929 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b08a7173a6391f5e1675b63238b8d7c84420f85d23a41c2a2acd37222326f17a"} Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.753992 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22ff4a7993029f87674c6dede26c09afd3c219fc39f91f7c78b60b7da0189169"} Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.754482 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.754519 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.754525 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.755159 5118 status_manager.go:851] "Failed to get status for pod" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: E0223 06:48:35.755217 5118 
mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.755703 5118 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.756277 5118 status_manager.go:851] "Failed to get status for pod" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" pod="openshift-authentication/oauth-openshift-558db77b4-h6hnk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-h6hnk\": dial tcp 38.102.83.46:6443: connect: connection refused" Feb 23 06:48:35 crc kubenswrapper[5118]: I0223 06:48:35.990558 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:48:36 crc kubenswrapper[5118]: E0223 06:48:36.184196 5118 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896cd5c7c1e964a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,LastTimestamp:2026-02-23 06:48:20.519933514 +0000 UTC m=+163.523718127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 06:48:36 crc kubenswrapper[5118]: I0223 06:48:36.767524 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3d227b11069285bfc09950734cd4052889f4f624bc374403ccf2c2241e447b81"} Feb 23 06:48:36 crc kubenswrapper[5118]: I0223 06:48:36.768130 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"822d608556838d8c31bd4e023b80e500029bbc053289888bae62e9b41aa12edf"} Feb 23 06:48:36 crc kubenswrapper[5118]: I0223 06:48:36.768153 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c1f7355971cf7acba13b6d72d3e850a601b6c2d046f6669d8d8da723aaf315f8"} Feb 23 06:48:37 crc kubenswrapper[5118]: I0223 06:48:37.788330 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2066472b97986bd5ca6cd9c1aae0a53ce59629223a03433f69dc0c53f8d95d6d"} Feb 23 06:48:37 crc kubenswrapper[5118]: I0223 06:48:37.788669 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"94795ba64427075d09a034cc2b4d521288511bfdda15c8b05862b2e985becd18"} Feb 23 06:48:37 crc kubenswrapper[5118]: I0223 06:48:37.788795 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:37 crc kubenswrapper[5118]: I0223 06:48:37.788845 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:37 crc kubenswrapper[5118]: I0223 06:48:37.788867 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:39 crc kubenswrapper[5118]: I0223 06:48:39.056942 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:48:39 crc kubenswrapper[5118]: I0223 06:48:39.066747 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:48:39 crc kubenswrapper[5118]: I0223 06:48:39.726310 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:39 crc kubenswrapper[5118]: I0223 06:48:39.726421 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:39 crc kubenswrapper[5118]: I0223 06:48:39.735541 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:42 crc kubenswrapper[5118]: I0223 06:48:42.802573 5118 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:42 crc kubenswrapper[5118]: I0223 06:48:42.830264 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:42 crc kubenswrapper[5118]: I0223 06:48:42.830299 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:42 crc kubenswrapper[5118]: I0223 06:48:42.833821 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:42 crc kubenswrapper[5118]: I0223 06:48:42.879901 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dbfd2374-ab19-4f9e-9eaa-f5c1f8a6f19d" Feb 23 06:48:43 crc kubenswrapper[5118]: I0223 06:48:43.866054 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:43 crc kubenswrapper[5118]: I0223 06:48:43.866119 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:43 crc kubenswrapper[5118]: I0223 06:48:43.882367 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dbfd2374-ab19-4f9e-9eaa-f5c1f8a6f19d" Feb 23 06:48:45 crc kubenswrapper[5118]: I0223 06:48:45.997307 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 06:48:52 crc kubenswrapper[5118]: I0223 06:48:52.508230 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 06:48:52 crc kubenswrapper[5118]: I0223 06:48:52.986131 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 23 06:48:53 crc kubenswrapper[5118]: I0223 06:48:53.192008 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 06:48:53 crc kubenswrapper[5118]: I0223 06:48:53.395903 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 06:48:53 crc kubenswrapper[5118]: I0223 06:48:53.934031 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.063963 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.117401 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.154621 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.190623 5118 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.405282 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:48:54 crc kubenswrapper[5118]: 
I0223 06:48:54.446285 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.615889 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.684541 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.771744 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 06:48:54 crc kubenswrapper[5118]: I0223 06:48:54.855367 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.091870 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.233533 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.461483 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.562402 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.563384 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.686596 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 
23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.858502 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.895497 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 06:48:55 crc kubenswrapper[5118]: I0223 06:48:55.964409 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.114984 5118 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.148432 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.199416 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.204778 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.214784 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.330688 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.421936 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.533637 5118 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.562905 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.590033 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.717818 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.794767 5118 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.855168 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.855722 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.898364 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 06:48:56 crc kubenswrapper[5118]: I0223 06:48:56.918556 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.042444 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.060045 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.105611 5118 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.153905 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.159649 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.162405 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.272225 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.326399 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.336074 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.339864 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.352037 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.378981 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.445279 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 
06:48:57.518250 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.556712 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.608711 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.629319 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.692090 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.694118 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.801712 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.844423 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.870900 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.881735 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.913802 5118 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.951291 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 23 06:48:57 crc kubenswrapper[5118]: I0223 06:48:57.977348 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.022463 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.078120 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.092349 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.194533 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.218395 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.244455 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.253318 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.282271 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: 
I0223 06:48:58.294897 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.295085 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.309148 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.377212 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.384386 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.395214 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.701346 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.723830 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.783829 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.789236 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.830691 5118 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.874459 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.908800 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.964391 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 23 06:48:58 crc kubenswrapper[5118]: I0223 06:48:58.974148 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.036344 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.175679 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.209191 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.296859 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.324605 5118 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.332287 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h6hnk","openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:48:59 crc 
kubenswrapper[5118]: I0223 06:48:59.332424 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-svhdq","openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 06:48:59 crc kubenswrapper[5118]: E0223 06:48:59.332812 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" containerName="installer" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.332847 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" containerName="installer" Feb 23 06:48:59 crc kubenswrapper[5118]: E0223 06:48:59.332875 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" containerName="oauth-openshift" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.332888 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" containerName="oauth-openshift" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.333018 5118 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.333074 5118 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b1845c79-b386-4abe-a0c6-dff68eafa20f" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.333237 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" containerName="oauth-openshift" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.333279 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b844ac65-c3d9-4e59-82cd-71d83bc3198a" containerName="installer" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.333887 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340012 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340368 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340487 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340535 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340748 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.340933 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.341257 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.341507 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.341556 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.341683 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 06:48:59 crc 
kubenswrapper[5118]: I0223 06:48:59.342639 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.343378 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.344954 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.350542 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.362002 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.369723 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.373472 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.37345036 podStartE2EDuration="17.37345036s" podCreationTimestamp="2026-02-23 06:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:48:59.372411593 +0000 UTC m=+202.376196216" watchObservedRunningTime="2026-02-23 06:48:59.37345036 +0000 UTC m=+202.377234933" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441538 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441581 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441602 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441646 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441666 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-svhdq\" 
(UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441688 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxjvz\" (UniqueName: \"kubernetes.io/projected/78c7daf0-3268-4942-a31a-32744ce0cb30-kube-api-access-fxjvz\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441708 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-policies\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441742 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-dir\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441757 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441776 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441793 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441808 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441828 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.441851 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.443721 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.465582 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544004 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544157 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544209 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544242 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544280 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544368 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544407 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544451 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxjvz\" (UniqueName: \"kubernetes.io/projected/78c7daf0-3268-4942-a31a-32744ce0cb30-kube-api-access-fxjvz\") pod 
\"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544497 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-policies\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544573 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-dir\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544611 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544648 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544680 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544715 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.544853 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-dir\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.553380 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-serving-cert\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.554062 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " 
pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.554402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-session\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.554598 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-cliconfig\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.554811 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.545727 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-service-ca\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.555810 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/78c7daf0-3268-4942-a31a-32744ce0cb30-audit-policies\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.556343 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.557861 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-error\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.564502 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-user-template-login\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.564924 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " 
pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.569224 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78c7daf0-3268-4942-a31a-32744ce0cb30-v4-0-config-system-router-certs\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.588845 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.599735 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxjvz\" (UniqueName: \"kubernetes.io/projected/78c7daf0-3268-4942-a31a-32744ce0cb30-kube-api-access-fxjvz\") pod \"oauth-openshift-56c7c74f4-svhdq\" (UID: \"78c7daf0-3268-4942-a31a-32744ce0cb30\") " pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.610898 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.676744 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.676832 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.706652 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b8cdf6-2770-4311-87a9-55c70e7967cf" path="/var/lib/kubelet/pods/90b8cdf6-2770-4311-87a9-55c70e7967cf/volumes" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.718219 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.761934 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.816570 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.865670 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 06:48:59 crc kubenswrapper[5118]: I0223 06:48:59.900291 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.044069 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.060919 5118 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.075570 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 
06:49:00.105287 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.227224 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.391651 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.469290 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.597186 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.731389 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.759237 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.770700 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 06:49:00 crc kubenswrapper[5118]: I0223 06:49:00.899516 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.192367 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.205912 5118 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.287679 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.295537 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.296436 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.310835 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.346934 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.403911 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.447203 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.459258 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.473967 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.511040 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 06:49:01 
crc kubenswrapper[5118]: I0223 06:49:01.515950 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.709510 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.731747 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.736416 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.766113 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.838788 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 06:49:01 crc kubenswrapper[5118]: I0223 06:49:01.999254 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-svhdq"] Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.053768 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.073004 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.079138 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.120667 5118 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.168344 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.301155 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.308770 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.340648 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.382352 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.475244 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.485745 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.501782 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.530633 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.545653 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-56c7c74f4-svhdq"] Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.664775 5118 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.706745 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.885903 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.888069 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.975554 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:49:02 crc kubenswrapper[5118]: I0223 06:49:02.975626 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.002896 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" event={"ID":"78c7daf0-3268-4942-a31a-32744ce0cb30","Type":"ContainerStarted","Data":"1e07408926db1dc5070e59894d4383981581690a64576bfb7c97254bd88b8bab"} Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.002947 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" 
event={"ID":"78c7daf0-3268-4942-a31a-32744ce0cb30","Type":"ContainerStarted","Data":"955e4419785b029817fbac05c913daddf9f7e5f41aa6bf5ae8176db75da1490d"} Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.004554 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.027279 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" podStartSLOduration=65.027229806 podStartE2EDuration="1m5.027229806s" podCreationTimestamp="2026-02-23 06:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:49:03.021710486 +0000 UTC m=+206.025495109" watchObservedRunningTime="2026-02-23 06:49:03.027229806 +0000 UTC m=+206.031014389" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.048660 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.060175 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.066742 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.087647 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.111376 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.281684 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-56c7c74f4-svhdq" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.302694 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.407832 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.414694 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.454513 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.503948 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.527120 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.706148 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.748825 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.773628 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.781611 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.781720 5118 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 23 06:49:03 crc kubenswrapper[5118]: I0223 06:49:03.935859 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.101941 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.142749 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.272075 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.272892 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.278933 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.297003 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.323080 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.403927 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.422577 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 06:49:04 crc 
kubenswrapper[5118]: I0223 06:49:04.490424 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.522556 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.643903 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.780025 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.799970 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.917649 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.937603 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 23 06:49:04 crc kubenswrapper[5118]: I0223 06:49:04.946008 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.089232 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.340424 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.375070 5118 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 
06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.375341 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0" gracePeriod=5 Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.380077 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.397068 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.475910 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.525910 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.601622 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.667130 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.690553 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.822998 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 23 06:49:05 crc 
kubenswrapper[5118]: I0223 06:49:05.850746 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.851255 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.870450 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.880072 5118 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.926024 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.926472 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 06:49:05 crc kubenswrapper[5118]: I0223 06:49:05.949764 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.015399 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.198534 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.292723 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.308881 5118 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.375241 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.476626 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.559611 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.601174 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.786370 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 23 06:49:06 crc kubenswrapper[5118]: I0223 06:49:06.990175 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.011245 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.050495 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.272822 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.305784 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.420316 5118 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.423876 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.494200 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.689171 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 06:49:07 crc kubenswrapper[5118]: I0223 06:49:07.903723 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.082195 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.141812 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.144927 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.171326 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.207716 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.245044 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 06:49:08 crc 
kubenswrapper[5118]: I0223 06:49:08.314447 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.931794 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 23 06:49:08 crc kubenswrapper[5118]: I0223 06:49:08.973894 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 06:49:09 crc kubenswrapper[5118]: I0223 06:49:09.092975 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 06:49:09 crc kubenswrapper[5118]: I0223 06:49:09.285564 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 06:49:09 crc kubenswrapper[5118]: I0223 06:49:09.327048 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 06:49:10 crc kubenswrapper[5118]: I0223 06:49:10.992850 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:49:10 crc kubenswrapper[5118]: I0223 06:49:10.993286 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.062738 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.063017 5118 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0" exitCode=137 Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.063195 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.063315 5118 scope.go:117] "RemoveContainer" containerID="b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.086188 5118 scope.go:117] "RemoveContainer" containerID="b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0" Feb 23 06:49:11 crc kubenswrapper[5118]: E0223 06:49:11.086740 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0\": container with ID starting with b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0 not found: ID does not exist" containerID="b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.086799 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0"} err="failed to get container status \"b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0\": rpc error: code = NotFound desc = could 
not find container \"b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0\": container with ID starting with b3f737c856defd6d1544c575e67b0405e8a569aba4e3cb40a0c04a0221dec5d0 not found: ID does not exist" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.128799 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.128935 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.128984 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129050 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129083 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129299 5118 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129365 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129396 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129441 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129492 5118 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.129519 5118 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.142368 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.230999 5118 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.231046 5118 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.231076 5118 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 23 06:49:11 crc kubenswrapper[5118]: I0223 06:49:11.712658 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 23 06:49:26 crc kubenswrapper[5118]: I0223 06:49:26.176237 5118 generic.go:334] "Generic (PLEG): container finished" podID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerID="0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b" exitCode=0 Feb 23 06:49:26 crc kubenswrapper[5118]: I0223 06:49:26.176369 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerDied","Data":"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"} Feb 23 06:49:26 crc kubenswrapper[5118]: I0223 06:49:26.177406 5118 scope.go:117] "RemoveContainer" containerID="0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b" Feb 23 06:49:27 crc kubenswrapper[5118]: I0223 06:49:27.188329 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerStarted","Data":"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"} Feb 23 06:49:27 crc kubenswrapper[5118]: I0223 06:49:27.188832 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:49:27 crc kubenswrapper[5118]: I0223 06:49:27.191350 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:49:32 crc kubenswrapper[5118]: I0223 06:49:32.975474 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:49:32 crc kubenswrapper[5118]: I0223 06:49:32.976173 5118 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:49:37 crc kubenswrapper[5118]: I0223 06:49:37.517139 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 06:49:39 crc kubenswrapper[5118]: I0223 06:49:39.229016 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 06:49:43 crc kubenswrapper[5118]: I0223 06:49:43.671164 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 06:49:45 crc kubenswrapper[5118]: I0223 06:49:45.107228 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 23 06:50:02 crc kubenswrapper[5118]: I0223 06:50:02.975365 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:50:02 crc kubenswrapper[5118]: I0223 06:50:02.975979 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:50:02 crc kubenswrapper[5118]: I0223 06:50:02.976045 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:50:02 crc kubenswrapper[5118]: I0223 06:50:02.976911 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:50:02 crc kubenswrapper[5118]: I0223 06:50:02.976997 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb" gracePeriod=600 Feb 23 06:50:03 crc kubenswrapper[5118]: I0223 06:50:03.453542 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb" exitCode=0 Feb 23 06:50:03 crc kubenswrapper[5118]: I0223 06:50:03.453626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb"} Feb 23 06:50:03 crc kubenswrapper[5118]: I0223 06:50:03.453940 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298"} Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.833796 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66df7c8f76-8pc9p"] Feb 23 06:50:24 crc kubenswrapper[5118]: E0223 06:50:24.835191 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.835215 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.835434 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.836396 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.863085 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8pc9p"] Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961411 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk24\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-kube-api-access-5jk24\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961476 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961520 
5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-trusted-ca\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961551 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-registry-tls\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961578 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a856f53-6752-4770-a447-eceb0de89538-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961610 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-registry-certificates\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961683 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8a856f53-6752-4770-a447-eceb0de89538-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8pc9p\" 
(UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.961727 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-bound-sa-token\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:24 crc kubenswrapper[5118]: I0223 06:50:24.990401 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.062864 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-bound-sa-token\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.062970 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk24\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-kube-api-access-5jk24\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.063049 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-trusted-ca\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.063090 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-registry-tls\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.063161 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a856f53-6752-4770-a447-eceb0de89538-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.063212 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-registry-certificates\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.063259 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8a856f53-6752-4770-a447-eceb0de89538-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.064250 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8a856f53-6752-4770-a447-eceb0de89538-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.065252 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-registry-certificates\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.065536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a856f53-6752-4770-a447-eceb0de89538-trusted-ca\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.075179 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-registry-tls\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.075348 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a856f53-6752-4770-a447-eceb0de89538-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc 
kubenswrapper[5118]: I0223 06:50:25.094554 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-bound-sa-token\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.095508 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk24\" (UniqueName: \"kubernetes.io/projected/8a856f53-6752-4770-a447-eceb0de89538-kube-api-access-5jk24\") pod \"image-registry-66df7c8f76-8pc9p\" (UID: \"8a856f53-6752-4770-a447-eceb0de89538\") " pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.187159 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.494371 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8pc9p"] Feb 23 06:50:25 crc kubenswrapper[5118]: I0223 06:50:25.590883 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" event={"ID":"8a856f53-6752-4770-a447-eceb0de89538","Type":"ContainerStarted","Data":"f4b31d6cf02dfe8466c28ed6cc6c050b4e5142243264975a041076fdf3123fb0"} Feb 23 06:50:26 crc kubenswrapper[5118]: I0223 06:50:26.601177 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" event={"ID":"8a856f53-6752-4770-a447-eceb0de89538","Type":"ContainerStarted","Data":"aeab90e9b836b57b4039d75cfe5d3e9e9d6d257d8e047489a5103847ed9f226b"} Feb 23 06:50:26 crc kubenswrapper[5118]: I0223 06:50:26.601364 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:26 crc kubenswrapper[5118]: I0223 06:50:26.633217 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" podStartSLOduration=2.633190575 podStartE2EDuration="2.633190575s" podCreationTimestamp="2026-02-23 06:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:50:26.632972438 +0000 UTC m=+289.636757061" watchObservedRunningTime="2026-02-23 06:50:26.633190575 +0000 UTC m=+289.636975188" Feb 23 06:50:37 crc kubenswrapper[5118]: I0223 06:50:37.485311 5118 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 23 06:50:45 crc kubenswrapper[5118]: I0223 06:50:45.197779 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8pc9p" Feb 23 06:50:45 crc kubenswrapper[5118]: I0223 06:50:45.307885 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.788181 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.791075 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jfmkj" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="registry-server" containerID="cri-o://573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212" gracePeriod=30 Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.800454 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldm4m"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.801777 
5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ldm4m" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="registry-server" containerID="cri-o://3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453" gracePeriod=30 Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.827669 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.828185 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator" containerID="cri-o://5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1" gracePeriod=30 Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.840795 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.841246 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-89mtl" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="registry-server" containerID="cri-o://052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af" gracePeriod=30 Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.850734 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-526dd"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.851056 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-526dd" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="registry-server" containerID="cri-o://e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a" gracePeriod=30 Feb 23 06:50:56 crc 
kubenswrapper[5118]: I0223 06:50:56.859279 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kzwjr"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.860214 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.868285 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kzwjr"] Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.988270 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.988354 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:56 crc kubenswrapper[5118]: I0223 06:50:56.988400 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fspfq\" (UniqueName: \"kubernetes.io/projected/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-kube-api-access-fspfq\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.089918 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.090169 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fspfq\" (UniqueName: \"kubernetes.io/projected/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-kube-api-access-fspfq\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.090226 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.093154 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.099021 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: 
\"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.110993 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fspfq\" (UniqueName: \"kubernetes.io/projected/2e64f70b-4830-4fef-94c0-d5eef18c5eb0-kube-api-access-fspfq\") pod \"marketplace-operator-79b997595-kzwjr\" (UID: \"2e64f70b-4830-4fef-94c0-d5eef18c5eb0\") " pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.195623 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.203889 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldm4m" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.209314 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-526dd" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.212888 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfmkj" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.214273 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.291422 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89mtl" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294181 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities\") pod \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294237 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content\") pod \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294272 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content\") pod \"67106794-6724-41de-9db8-d51c468e1e28\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294307 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsr2n\" (UniqueName: \"kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n\") pod \"b35a400e-565d-40e1-aa28-4896f541c19f\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294329 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rvsr\" (UniqueName: \"kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr\") pod \"67106794-6724-41de-9db8-d51c468e1e28\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294360 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities\") pod \"67106794-6724-41de-9db8-d51c468e1e28\" (UID: \"67106794-6724-41de-9db8-d51c468e1e28\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294382 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content\") pod \"b35a400e-565d-40e1-aa28-4896f541c19f\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294401 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca\") pod \"34479184-8c08-43c7-b0c6-7d46408f3f33\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294426 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics\") pod \"34479184-8c08-43c7-b0c6-7d46408f3f33\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294445 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrhc\" (UniqueName: \"kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc\") pod \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\" (UID: \"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294473 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q67b6\" (UniqueName: \"kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6\") pod 
\"34479184-8c08-43c7-b0c6-7d46408f3f33\" (UID: \"34479184-8c08-43c7-b0c6-7d46408f3f33\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.294497 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities\") pod \"b35a400e-565d-40e1-aa28-4896f541c19f\" (UID: \"b35a400e-565d-40e1-aa28-4896f541c19f\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.295756 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities" (OuterVolumeSpecName: "utilities") pod "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" (UID: "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.297355 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "34479184-8c08-43c7-b0c6-7d46408f3f33" (UID: "34479184-8c08-43c7-b0c6-7d46408f3f33"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.298353 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities" (OuterVolumeSpecName: "utilities") pod "67106794-6724-41de-9db8-d51c468e1e28" (UID: "67106794-6724-41de-9db8-d51c468e1e28"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.301661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "34479184-8c08-43c7-b0c6-7d46408f3f33" (UID: "34479184-8c08-43c7-b0c6-7d46408f3f33"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.301961 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc" (OuterVolumeSpecName: "kube-api-access-2mrhc") pod "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" (UID: "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f"). InnerVolumeSpecName "kube-api-access-2mrhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.302794 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities" (OuterVolumeSpecName: "utilities") pod "b35a400e-565d-40e1-aa28-4896f541c19f" (UID: "b35a400e-565d-40e1-aa28-4896f541c19f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.304866 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n" (OuterVolumeSpecName: "kube-api-access-bsr2n") pod "b35a400e-565d-40e1-aa28-4896f541c19f" (UID: "b35a400e-565d-40e1-aa28-4896f541c19f"). InnerVolumeSpecName "kube-api-access-bsr2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.305686 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6" (OuterVolumeSpecName: "kube-api-access-q67b6") pod "34479184-8c08-43c7-b0c6-7d46408f3f33" (UID: "34479184-8c08-43c7-b0c6-7d46408f3f33"). InnerVolumeSpecName "kube-api-access-q67b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.307210 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr" (OuterVolumeSpecName: "kube-api-access-7rvsr") pod "67106794-6724-41de-9db8-d51c468e1e28" (UID: "67106794-6724-41de-9db8-d51c468e1e28"). InnerVolumeSpecName "kube-api-access-7rvsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.366609 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" (UID: "4051ea46-bd23-4bc5-ae80-9b3cba5aa41f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.380661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b35a400e-565d-40e1-aa28-4896f541c19f" (UID: "b35a400e-565d-40e1-aa28-4896f541c19f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395138 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content\") pod \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395185 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkjfp\" (UniqueName: \"kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp\") pod \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395555 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities\") pod \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\" (UID: \"ae00e6ae-91b4-48e7-8836-53d0fc36c777\") " Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395859 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395871 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsr2n\" (UniqueName: \"kubernetes.io/projected/b35a400e-565d-40e1-aa28-4896f541c19f-kube-api-access-bsr2n\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395884 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rvsr\" (UniqueName: \"kubernetes.io/projected/67106794-6724-41de-9db8-d51c468e1e28-kube-api-access-7rvsr\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: 
I0223 06:50:57.395896 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395904 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395913 5118 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395922 5118 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/34479184-8c08-43c7-b0c6-7d46408f3f33-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395931 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrhc\" (UniqueName: \"kubernetes.io/projected/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-kube-api-access-2mrhc\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395943 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q67b6\" (UniqueName: \"kubernetes.io/projected/34479184-8c08-43c7-b0c6-7d46408f3f33-kube-api-access-q67b6\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395951 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b35a400e-565d-40e1-aa28-4896f541c19f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.395959 5118 reconciler_common.go:293] "Volume detached 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.396269 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities" (OuterVolumeSpecName: "utilities") pod "ae00e6ae-91b4-48e7-8836-53d0fc36c777" (UID: "ae00e6ae-91b4-48e7-8836-53d0fc36c777"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.398830 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp" (OuterVolumeSpecName: "kube-api-access-qkjfp") pod "ae00e6ae-91b4-48e7-8836-53d0fc36c777" (UID: "ae00e6ae-91b4-48e7-8836-53d0fc36c777"). InnerVolumeSpecName "kube-api-access-qkjfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.422676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae00e6ae-91b4-48e7-8836-53d0fc36c777" (UID: "ae00e6ae-91b4-48e7-8836-53d0fc36c777"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.449817 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67106794-6724-41de-9db8-d51c468e1e28" (UID: "67106794-6724-41de-9db8-d51c468e1e28"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.470515 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kzwjr"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.496534 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.496566 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkjfp\" (UniqueName: \"kubernetes.io/projected/ae00e6ae-91b4-48e7-8836-53d0fc36c777-kube-api-access-qkjfp\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.496578 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67106794-6724-41de-9db8-d51c468e1e28-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.496588 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae00e6ae-91b4-48e7-8836-53d0fc36c777-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.837139 5118 generic.go:334] "Generic (PLEG): container finished" podID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerID="052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af" exitCode=0
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.837236 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerDied","Data":"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.837328 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89mtl" event={"ID":"ae00e6ae-91b4-48e7-8836-53d0fc36c777","Type":"ContainerDied","Data":"797e0ec091fcede2fd9bc0c59729c3689f789c272023c7ad82cc7465e4211974"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.837374 5118 scope.go:117] "RemoveContainer" containerID="052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.838830 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89mtl"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.845895 5118 generic.go:334] "Generic (PLEG): container finished" podID="b35a400e-565d-40e1-aa28-4896f541c19f" containerID="3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453" exitCode=0
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.845946 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerDied","Data":"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.845975 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ldm4m" event={"ID":"b35a400e-565d-40e1-aa28-4896f541c19f","Type":"ContainerDied","Data":"f47d887f28583191e65ec216e11560234247effca3bb6f24ce7e42480d8e930f"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.846140 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ldm4m"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.850768 5118 generic.go:334] "Generic (PLEG): container finished" podID="67106794-6724-41de-9db8-d51c468e1e28" containerID="e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a" exitCode=0
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.851622 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-526dd"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.850858 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerDied","Data":"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.852091 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-526dd" event={"ID":"67106794-6724-41de-9db8-d51c468e1e28","Type":"ContainerDied","Data":"6cfdf60797c9c095e1906d90ce251c8d495ec5b4f750e025d77e8150c341f859"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.854821 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" event={"ID":"2e64f70b-4830-4fef-94c0-d5eef18c5eb0","Type":"ContainerStarted","Data":"a733a16c36223437a2e8dab53b464fd1579684a7a15699b75e0eabc1431ef356"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.854886 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" event={"ID":"2e64f70b-4830-4fef-94c0-d5eef18c5eb0","Type":"ContainerStarted","Data":"afe2c5a0bef30c4f7bdf80d9c3695e01d7ab3b2bea7e39991b1d1301d49d39e2"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.855463 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.857534 5118 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kzwjr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused" start-of-body=
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.857606 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" podUID="2e64f70b-4830-4fef-94c0-d5eef18c5eb0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": dial tcp 10.217.0.65:8080: connect: connection refused"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.858658 5118 generic.go:334] "Generic (PLEG): container finished" podID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerID="5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1" exitCode=0
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.858836 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerDied","Data":"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.858898 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm" event={"ID":"34479184-8c08-43c7-b0c6-7d46408f3f33","Type":"ContainerDied","Data":"b280c9b5348694540fdc64fc0302c47165a5a62ec6924b8651e7def1c315bf3d"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.858992 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-f42lm"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.870293 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.870479 5118 generic.go:334] "Generic (PLEG): container finished" podID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerID="573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212" exitCode=0
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.870510 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerDied","Data":"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.870566 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jfmkj" event={"ID":"4051ea46-bd23-4bc5-ae80-9b3cba5aa41f","Type":"ContainerDied","Data":"d4b94230197bc7f443ccb1e3e8d6e1919aeacf4a142772dcc3fe4c3b32b8921f"}
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.870598 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jfmkj"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.872995 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-89mtl"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.874327 5118 scope.go:117] "RemoveContainer" containerID="c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.908939 5118 scope.go:117] "RemoveContainer" containerID="8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.909903 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ldm4m"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.916629 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ldm4m"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.920359 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.927209 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-f42lm"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.931618 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" podStartSLOduration=1.931594809 podStartE2EDuration="1.931594809s" podCreationTimestamp="2026-02-23 06:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:50:57.925736915 +0000 UTC m=+320.929521518" watchObservedRunningTime="2026-02-23 06:50:57.931594809 +0000 UTC m=+320.935379392"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.933965 5118 scope.go:117] "RemoveContainer" containerID="052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"
Feb 23 06:50:57 crc kubenswrapper[5118]: E0223 06:50:57.934834 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af\": container with ID starting with 052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af not found: ID does not exist" containerID="052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.934882 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af"} err="failed to get container status \"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af\": rpc error: code = NotFound desc = could not find container \"052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af\": container with ID starting with 052e2c8697ee4d41d42c3f6345df339b86a667eb4d4b98628851ff9fbfdb23af not found: ID does not exist"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.934913 5118 scope.go:117] "RemoveContainer" containerID="c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369"
Feb 23 06:50:57 crc kubenswrapper[5118]: E0223 06:50:57.940678 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369\": container with ID starting with c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369 not found: ID does not exist" containerID="c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.948597 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369"} err="failed to get container status \"c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369\": rpc error: code = NotFound desc = could not find container \"c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369\": container with ID starting with c129603229312094dc993d2606086d308e61bb2528e0f6893827ecd456d82369 not found: ID does not exist"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.948756 5118 scope.go:117] "RemoveContainer" containerID="8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.947756 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"]
Feb 23 06:50:57 crc kubenswrapper[5118]: E0223 06:50:57.949358 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12\": container with ID starting with 8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12 not found: ID does not exist" containerID="8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.949420 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12"} err="failed to get container status \"8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12\": rpc error: code = NotFound desc = could not find container \"8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12\": container with ID starting with 8f9f54dab0cbdb82e3242e3f70275409aa9b27db2604a3d04b98e240c1fb8a12 not found: ID does not exist"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.949463 5118 scope.go:117] "RemoveContainer" containerID="3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.951624 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jfmkj"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.959759 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-526dd"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.962207 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-526dd"]
Feb 23 06:50:57 crc kubenswrapper[5118]: I0223 06:50:57.990492 5118 scope.go:117] "RemoveContainer" containerID="7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.013067 5118 scope.go:117] "RemoveContainer" containerID="a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.030397 5118 scope.go:117] "RemoveContainer" containerID="3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.032946 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453\": container with ID starting with 3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453 not found: ID does not exist" containerID="3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.032987 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453"} err="failed to get container status \"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453\": rpc error: code = NotFound desc = could not find container \"3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453\": container with ID starting with 3428b0c626488997be52dc8da1a65f65a11a433abb1678196e06cb4989680453 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.033010 5118 scope.go:117] "RemoveContainer" containerID="7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.033470 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64\": container with ID starting with 7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64 not found: ID does not exist" containerID="7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.033520 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64"} err="failed to get container status \"7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64\": rpc error: code = NotFound desc = could not find container \"7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64\": container with ID starting with 7bf56c2fd5e6b4e79e0f7d2ff8a82e01cd2778e912486bf93473cb3f78b96e64 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.033553 5118 scope.go:117] "RemoveContainer" containerID="a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.033904 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8\": container with ID starting with a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8 not found: ID does not exist" containerID="a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.033936 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8"} err="failed to get container status \"a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8\": rpc error: code = NotFound desc = could not find container \"a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8\": container with ID starting with a2b01d6edab1ab7385d13d17c694cf809a656788bd157ae5bf360dc4e5c48ed8 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.033951 5118 scope.go:117] "RemoveContainer" containerID="e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.049777 5118 scope.go:117] "RemoveContainer" containerID="d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.076689 5118 scope.go:117] "RemoveContainer" containerID="4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.104599 5118 scope.go:117] "RemoveContainer" containerID="e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.105685 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a\": container with ID starting with e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a not found: ID does not exist" containerID="e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.105749 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a"} err="failed to get container status \"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a\": rpc error: code = NotFound desc = could not find container \"e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a\": container with ID starting with e1e6389008f57542acd0d8d922e246660d27b6a09b6557acf443d5feb5d9925a not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.105797 5118 scope.go:117] "RemoveContainer" containerID="d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.106742 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665\": container with ID starting with d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665 not found: ID does not exist" containerID="d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.106787 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665"} err="failed to get container status \"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665\": rpc error: code = NotFound desc = could not find container \"d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665\": container with ID starting with d78af841c12af7be434cdfa2eb8de9d99b5504b6501e80152a2fd1f172225665 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.106827 5118 scope.go:117] "RemoveContainer" containerID="4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.107573 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268\": container with ID starting with 4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268 not found: ID does not exist" containerID="4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.107631 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268"} err="failed to get container status \"4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268\": rpc error: code = NotFound desc = could not find container \"4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268\": container with ID starting with 4f3faca5e68cf2380435a25794a6ae658167d1f720bb0c440ebe5d3158a8f268 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.107690 5118 scope.go:117] "RemoveContainer" containerID="5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.127652 5118 scope.go:117] "RemoveContainer" containerID="0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.143493 5118 scope.go:117] "RemoveContainer" containerID="5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.144138 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1\": container with ID starting with 5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1 not found: ID does not exist" containerID="5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.144204 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1"} err="failed to get container status \"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1\": rpc error: code = NotFound desc = could not find container \"5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1\": container with ID starting with 5335e01c4fc085932423380269593afe397564085f7a68e813cf161860d6c3d1 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.144244 5118 scope.go:117] "RemoveContainer" containerID="0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.144798 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b\": container with ID starting with 0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b not found: ID does not exist" containerID="0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.144865 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b"} err="failed to get container status \"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b\": rpc error: code = NotFound desc = could not find container \"0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b\": container with ID starting with 0b8fc0f1072acaaf250251582fa0ae52bfb1958e245e72eece2b44d7130d216b not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.144894 5118 scope.go:117] "RemoveContainer" containerID="573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.163507 5118 scope.go:117] "RemoveContainer" containerID="0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.182568 5118 scope.go:117] "RemoveContainer" containerID="a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.203115 5118 scope.go:117] "RemoveContainer" containerID="573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.203843 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212\": container with ID starting with 573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212 not found: ID does not exist" containerID="573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.203909 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212"} err="failed to get container status \"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212\": rpc error: code = NotFound desc = could not find container \"573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212\": container with ID starting with 573f5873a50da7b4505d67dc22725eb72054514cfafa0cea8a93b497894da212 not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.203953 5118 scope.go:117] "RemoveContainer" containerID="0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.204393 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da\": container with ID starting with 0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da not found: ID does not exist" containerID="0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.204429 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da"} err="failed to get container status \"0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da\": rpc error: code = NotFound desc = could not find container \"0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da\": container with ID starting with 0228d4885cc63f453011546f37e539652d06a5ee6ce8a38e98c7efd2c303b7da not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.204456 5118 scope.go:117] "RemoveContainer" containerID="a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd"
Feb 23 06:50:58 crc kubenswrapper[5118]: E0223 06:50:58.204922 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd\": container with ID starting with a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd not found: ID does not exist" containerID="a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.204975 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd"} err="failed to get container status \"a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd\": rpc error: code = NotFound desc = could not find container \"a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd\": container with ID starting with a4977c62a4b59e1319e5c0a6be4d40d85dab0dae1a596469df3a412ec7fa8acd not found: ID does not exist"
Feb 23 06:50:58 crc kubenswrapper[5118]: I0223 06:50:58.893482 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012297 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"]
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012686 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012725 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012748 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012762 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012779 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012795 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012810 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012822 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012837 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012851 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012871 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012884 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012912 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012924 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012946 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012959 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.012977 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.012991 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.013007 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013019 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="extract-utilities"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.013039 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013051 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.013071 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013084 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.013134 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013148 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: E0223 06:50:59.013166 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013178 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="extract-content"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013390 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013421 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013442 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013465 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013500 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="67106794-6724-41de-9db8-d51c468e1e28" containerName="registry-server"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.013899 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" containerName="marketplace-operator"
Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.014834 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.018893 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.024824 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"] Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.120466 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnl2c\" (UniqueName: \"kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.120541 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.120692 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.200174 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wgrql"] Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.201652 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.204418 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.221815 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnl2c\" (UniqueName: \"kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.222143 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.222239 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.223008 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgrql"] Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.223317 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " 
pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.223523 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.277866 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnl2c\" (UniqueName: \"kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c\") pod \"certified-operators-qgpxb\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.324176 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-utilities\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.324297 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkgs\" (UniqueName: \"kubernetes.io/projected/70ba3929-8fce-46de-ac77-f0ac791d194c-kube-api-access-rmkgs\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.324371 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-catalog-content\") pod \"community-operators-wgrql\" (UID: 
\"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.386561 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.426275 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-catalog-content\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.426778 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-utilities\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.426859 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkgs\" (UniqueName: \"kubernetes.io/projected/70ba3929-8fce-46de-ac77-f0ac791d194c-kube-api-access-rmkgs\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.426926 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-catalog-content\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.427725 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70ba3929-8fce-46de-ac77-f0ac791d194c-utilities\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.459942 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkgs\" (UniqueName: \"kubernetes.io/projected/70ba3929-8fce-46de-ac77-f0ac791d194c-kube-api-access-rmkgs\") pod \"community-operators-wgrql\" (UID: \"70ba3929-8fce-46de-ac77-f0ac791d194c\") " pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.525592 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.651770 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"] Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.704456 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34479184-8c08-43c7-b0c6-7d46408f3f33" path="/var/lib/kubelet/pods/34479184-8c08-43c7-b0c6-7d46408f3f33/volumes" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.705086 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4051ea46-bd23-4bc5-ae80-9b3cba5aa41f" path="/var/lib/kubelet/pods/4051ea46-bd23-4bc5-ae80-9b3cba5aa41f/volumes" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.705875 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67106794-6724-41de-9db8-d51c468e1e28" path="/var/lib/kubelet/pods/67106794-6724-41de-9db8-d51c468e1e28/volumes" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.707306 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae00e6ae-91b4-48e7-8836-53d0fc36c777" 
path="/var/lib/kubelet/pods/ae00e6ae-91b4-48e7-8836-53d0fc36c777/volumes" Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.708143 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35a400e-565d-40e1-aa28-4896f541c19f" path="/var/lib/kubelet/pods/b35a400e-565d-40e1-aa28-4896f541c19f/volumes" Feb 23 06:50:59 crc kubenswrapper[5118]: W0223 06:50:59.798798 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ba3929_8fce_46de_ac77_f0ac791d194c.slice/crio-977d25aba44e06a88c8c68a4e02d501da81eb55bf49e711fb29d370091f31455 WatchSource:0}: Error finding container 977d25aba44e06a88c8c68a4e02d501da81eb55bf49e711fb29d370091f31455: Status 404 returned error can't find the container with id 977d25aba44e06a88c8c68a4e02d501da81eb55bf49e711fb29d370091f31455 Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.799725 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wgrql"] Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.896078 5118 generic.go:334] "Generic (PLEG): container finished" podID="2eea3976-c9c9-4160-861a-251e0822d119" containerID="1852d0a45e6c6264a786f75fdc453d244ecee2dcf202f634ca697c21f6bfaeaa" exitCode=0 Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.896168 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerDied","Data":"1852d0a45e6c6264a786f75fdc453d244ecee2dcf202f634ca697c21f6bfaeaa"} Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.896245 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerStarted","Data":"ecf056beba4d1c834e9287ee2a800f2c2486f3b7dcc7cbaee46bc1e92227bea6"} Feb 23 06:50:59 crc kubenswrapper[5118]: I0223 06:50:59.902479 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgrql" event={"ID":"70ba3929-8fce-46de-ac77-f0ac791d194c","Type":"ContainerStarted","Data":"977d25aba44e06a88c8c68a4e02d501da81eb55bf49e711fb29d370091f31455"} Feb 23 06:51:00 crc kubenswrapper[5118]: I0223 06:51:00.910413 5118 generic.go:334] "Generic (PLEG): container finished" podID="70ba3929-8fce-46de-ac77-f0ac791d194c" containerID="3c446e8046218d27398a503decbbbe4a62a02cb22f3d1015b29533ea748bebdc" exitCode=0 Feb 23 06:51:00 crc kubenswrapper[5118]: I0223 06:51:00.910574 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgrql" event={"ID":"70ba3929-8fce-46de-ac77-f0ac791d194c","Type":"ContainerDied","Data":"3c446e8046218d27398a503decbbbe4a62a02cb22f3d1015b29533ea748bebdc"} Feb 23 06:51:00 crc kubenswrapper[5118]: I0223 06:51:00.914443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerStarted","Data":"1441700cfa0e441468f6451100683aad2b71408650770b08579d10650fc5d65b"} Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.415746 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-svbnv"] Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.417668 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.420702 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.423981 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svbnv"] Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.563710 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-utilities\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.563816 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk5z4\" (UniqueName: \"kubernetes.io/projected/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-kube-api-access-mk5z4\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.563909 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-catalog-content\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.610261 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m8j96"] Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.612462 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.616972 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.623088 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8j96"] Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.665723 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk5z4\" (UniqueName: \"kubernetes.io/projected/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-kube-api-access-mk5z4\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.665821 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-catalog-content\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.665891 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-catalog-content\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.665948 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6sgg\" (UniqueName: \"kubernetes.io/projected/051f6240-bd14-4b49-b892-318d70c8a38c-kube-api-access-t6sgg\") pod \"redhat-operators-m8j96\" (UID: 
\"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.665999 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-utilities\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.666166 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-utilities\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.666480 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-catalog-content\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.667265 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-utilities\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.687970 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk5z4\" (UniqueName: \"kubernetes.io/projected/38dd814b-2c41-4f46-a3c3-e54c5bcf2c43-kube-api-access-mk5z4\") pod \"redhat-marketplace-svbnv\" (UID: \"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43\") " 
pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.762628 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.766971 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-catalog-content\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.767009 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6sgg\" (UniqueName: \"kubernetes.io/projected/051f6240-bd14-4b49-b892-318d70c8a38c-kube-api-access-t6sgg\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.767032 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-utilities\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.767588 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-catalog-content\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.767653 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/051f6240-bd14-4b49-b892-318d70c8a38c-utilities\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.786028 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6sgg\" (UniqueName: \"kubernetes.io/projected/051f6240-bd14-4b49-b892-318d70c8a38c-kube-api-access-t6sgg\") pod \"redhat-operators-m8j96\" (UID: \"051f6240-bd14-4b49-b892-318d70c8a38c\") " pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.950858 5118 generic.go:334] "Generic (PLEG): container finished" podID="2eea3976-c9c9-4160-861a-251e0822d119" containerID="1441700cfa0e441468f6451100683aad2b71408650770b08579d10650fc5d65b" exitCode=0 Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.951005 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerDied","Data":"1441700cfa0e441468f6451100683aad2b71408650770b08579d10650fc5d65b"} Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.955366 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgrql" event={"ID":"70ba3929-8fce-46de-ac77-f0ac791d194c","Type":"ContainerStarted","Data":"985d3b3bd1bcc6efb10b19b7f2425153482cdad9b243d9a1609179dd8e77d37b"} Feb 23 06:51:01 crc kubenswrapper[5118]: I0223 06:51:01.986177 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.001418 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-svbnv"] Feb 23 06:51:02 crc kubenswrapper[5118]: W0223 06:51:02.008226 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38dd814b_2c41_4f46_a3c3_e54c5bcf2c43.slice/crio-dbf498cd2a0855ad8cbb9ff953ab232308f5c240897fc8fdf23b49ce0f44cf6c WatchSource:0}: Error finding container dbf498cd2a0855ad8cbb9ff953ab232308f5c240897fc8fdf23b49ce0f44cf6c: Status 404 returned error can't find the container with id dbf498cd2a0855ad8cbb9ff953ab232308f5c240897fc8fdf23b49ce0f44cf6c Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.261914 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8j96"] Feb 23 06:51:02 crc kubenswrapper[5118]: W0223 06:51:02.267747 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051f6240_bd14_4b49_b892_318d70c8a38c.slice/crio-5fa72e0f7879a6fb1a9cf1af6669951eba91f8f92f0d8a258a419b9d4eb55792 WatchSource:0}: Error finding container 5fa72e0f7879a6fb1a9cf1af6669951eba91f8f92f0d8a258a419b9d4eb55792: Status 404 returned error can't find the container with id 5fa72e0f7879a6fb1a9cf1af6669951eba91f8f92f0d8a258a419b9d4eb55792 Feb 23 06:51:02 crc kubenswrapper[5118]: E0223 06:51:02.285377 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38dd814b_2c41_4f46_a3c3_e54c5bcf2c43.slice/crio-193daa4816c527cbe23d3180d9bfcbc6345221a92f07ebe8b4d0446361136b91.scope\": RecentStats: unable to find data in memory cache]" Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.964395 5118 generic.go:334] 
"Generic (PLEG): container finished" podID="051f6240-bd14-4b49-b892-318d70c8a38c" containerID="8322b0a89ec15b1e190db86cc2b1da03653da3130b73389218ec77213aeb6137" exitCode=0 Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.964477 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8j96" event={"ID":"051f6240-bd14-4b49-b892-318d70c8a38c","Type":"ContainerDied","Data":"8322b0a89ec15b1e190db86cc2b1da03653da3130b73389218ec77213aeb6137"} Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.966696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8j96" event={"ID":"051f6240-bd14-4b49-b892-318d70c8a38c","Type":"ContainerStarted","Data":"5fa72e0f7879a6fb1a9cf1af6669951eba91f8f92f0d8a258a419b9d4eb55792"} Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.971941 5118 generic.go:334] "Generic (PLEG): container finished" podID="70ba3929-8fce-46de-ac77-f0ac791d194c" containerID="985d3b3bd1bcc6efb10b19b7f2425153482cdad9b243d9a1609179dd8e77d37b" exitCode=0 Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.972043 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgrql" event={"ID":"70ba3929-8fce-46de-ac77-f0ac791d194c","Type":"ContainerDied","Data":"985d3b3bd1bcc6efb10b19b7f2425153482cdad9b243d9a1609179dd8e77d37b"} Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.976276 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerStarted","Data":"88f93f31c396438b5dc50e1f8b57df9623d49192bf1f6d9d4ae9f0531e266159"} Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.979119 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svbnv" 
event={"ID":"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43","Type":"ContainerDied","Data":"193daa4816c527cbe23d3180d9bfcbc6345221a92f07ebe8b4d0446361136b91"} Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.979158 5118 generic.go:334] "Generic (PLEG): container finished" podID="38dd814b-2c41-4f46-a3c3-e54c5bcf2c43" containerID="193daa4816c527cbe23d3180d9bfcbc6345221a92f07ebe8b4d0446361136b91" exitCode=0 Feb 23 06:51:02 crc kubenswrapper[5118]: I0223 06:51:02.979389 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svbnv" event={"ID":"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43","Type":"ContainerStarted","Data":"dbf498cd2a0855ad8cbb9ff953ab232308f5c240897fc8fdf23b49ce0f44cf6c"} Feb 23 06:51:03 crc kubenswrapper[5118]: I0223 06:51:03.059761 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qgpxb" podStartSLOduration=2.5958458049999997 podStartE2EDuration="5.059723452s" podCreationTimestamp="2026-02-23 06:50:58 +0000 UTC" firstStartedPulling="2026-02-23 06:50:59.897782591 +0000 UTC m=+322.901567164" lastFinishedPulling="2026-02-23 06:51:02.361660238 +0000 UTC m=+325.365444811" observedRunningTime="2026-02-23 06:51:03.05462017 +0000 UTC m=+326.058404743" watchObservedRunningTime="2026-02-23 06:51:03.059723452 +0000 UTC m=+326.063508045" Feb 23 06:51:03 crc kubenswrapper[5118]: I0223 06:51:03.988838 5118 generic.go:334] "Generic (PLEG): container finished" podID="38dd814b-2c41-4f46-a3c3-e54c5bcf2c43" containerID="f6a9a6effea92172f124e54cb11c89c6b1e6443d55b10b8fc72a54fd28cd5e0e" exitCode=0 Feb 23 06:51:03 crc kubenswrapper[5118]: I0223 06:51:03.989023 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svbnv" event={"ID":"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43","Type":"ContainerDied","Data":"f6a9a6effea92172f124e54cb11c89c6b1e6443d55b10b8fc72a54fd28cd5e0e"} Feb 23 06:51:03 crc kubenswrapper[5118]: I0223 
06:51:03.995771 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8j96" event={"ID":"051f6240-bd14-4b49-b892-318d70c8a38c","Type":"ContainerStarted","Data":"66210f0243f2f44f7bba3f01a304282afce2dd342830c20d01a680989789d431"} Feb 23 06:51:04 crc kubenswrapper[5118]: I0223 06:51:04.003953 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wgrql" event={"ID":"70ba3929-8fce-46de-ac77-f0ac791d194c","Type":"ContainerStarted","Data":"b594587029e68fa078ead78a7804f73fe0ec8ad83f674889ecb1b2d3306e6f31"} Feb 23 06:51:04 crc kubenswrapper[5118]: I0223 06:51:04.038750 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wgrql" podStartSLOduration=2.52857991 podStartE2EDuration="5.03872539s" podCreationTimestamp="2026-02-23 06:50:59 +0000 UTC" firstStartedPulling="2026-02-23 06:51:00.912903264 +0000 UTC m=+323.916687837" lastFinishedPulling="2026-02-23 06:51:03.423048734 +0000 UTC m=+326.426833317" observedRunningTime="2026-02-23 06:51:04.037244649 +0000 UTC m=+327.041029242" watchObservedRunningTime="2026-02-23 06:51:04.03872539 +0000 UTC m=+327.042509963" Feb 23 06:51:05 crc kubenswrapper[5118]: I0223 06:51:05.013632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-svbnv" event={"ID":"38dd814b-2c41-4f46-a3c3-e54c5bcf2c43","Type":"ContainerStarted","Data":"eb91e4d969d9d425089385bf7ae53a5d9466266e015f66eaeca8462f254aa363"} Feb 23 06:51:05 crc kubenswrapper[5118]: I0223 06:51:05.027522 5118 generic.go:334] "Generic (PLEG): container finished" podID="051f6240-bd14-4b49-b892-318d70c8a38c" containerID="66210f0243f2f44f7bba3f01a304282afce2dd342830c20d01a680989789d431" exitCode=0 Feb 23 06:51:05 crc kubenswrapper[5118]: I0223 06:51:05.027636 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8j96" 
event={"ID":"051f6240-bd14-4b49-b892-318d70c8a38c","Type":"ContainerDied","Data":"66210f0243f2f44f7bba3f01a304282afce2dd342830c20d01a680989789d431"} Feb 23 06:51:05 crc kubenswrapper[5118]: I0223 06:51:05.046308 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-svbnv" podStartSLOduration=2.633398199 podStartE2EDuration="4.046295906s" podCreationTimestamp="2026-02-23 06:51:01 +0000 UTC" firstStartedPulling="2026-02-23 06:51:02.996781277 +0000 UTC m=+326.000565850" lastFinishedPulling="2026-02-23 06:51:04.409678974 +0000 UTC m=+327.413463557" observedRunningTime="2026-02-23 06:51:05.043852927 +0000 UTC m=+328.047637500" watchObservedRunningTime="2026-02-23 06:51:05.046295906 +0000 UTC m=+328.050080489" Feb 23 06:51:06 crc kubenswrapper[5118]: I0223 06:51:06.038166 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8j96" event={"ID":"051f6240-bd14-4b49-b892-318d70c8a38c","Type":"ContainerStarted","Data":"8fa877de2f0f4156b61b9f66690e77536755f9f4c7c2509da8d611b590a22b33"} Feb 23 06:51:06 crc kubenswrapper[5118]: I0223 06:51:06.063592 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m8j96" podStartSLOduration=2.617943487 podStartE2EDuration="5.063572681s" podCreationTimestamp="2026-02-23 06:51:01 +0000 UTC" firstStartedPulling="2026-02-23 06:51:02.968838138 +0000 UTC m=+325.972622711" lastFinishedPulling="2026-02-23 06:51:05.414467322 +0000 UTC m=+328.418251905" observedRunningTime="2026-02-23 06:51:06.060614269 +0000 UTC m=+329.064398842" watchObservedRunningTime="2026-02-23 06:51:06.063572681 +0000 UTC m=+329.067357254" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.387601 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.387663 5118 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.459527 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.526744 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.526788 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:51:09 crc kubenswrapper[5118]: I0223 06:51:09.579873 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.135818 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wgrql" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.138450 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.366271 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" podUID="08634871-e819-4b75-93e5-fed45013b977" containerName="registry" containerID="cri-o://56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa" gracePeriod=30 Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.747767 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807618 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807702 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807762 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807794 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807917 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.807988 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fcpd6\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.808027 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.808229 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"08634871-e819-4b75-93e5-fed45013b977\" (UID: \"08634871-e819-4b75-93e5-fed45013b977\") " Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.815442 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.815555 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.828469 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.828789 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6" (OuterVolumeSpecName: "kube-api-access-fcpd6") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "kube-api-access-fcpd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.837223 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.842001 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.843032 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.845312 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "08634871-e819-4b75-93e5-fed45013b977" (UID: "08634871-e819-4b75-93e5-fed45013b977"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.910955 5118 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.910998 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcpd6\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-kube-api-access-fcpd6\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.911014 5118 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/08634871-e819-4b75-93e5-fed45013b977-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.911027 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/08634871-e819-4b75-93e5-fed45013b977-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.911044 5118 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.911055 5118 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/08634871-e819-4b75-93e5-fed45013b977-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:10 crc kubenswrapper[5118]: I0223 06:51:10.911067 5118 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/08634871-e819-4b75-93e5-fed45013b977-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.073871 5118 generic.go:334] "Generic (PLEG): container finished" podID="08634871-e819-4b75-93e5-fed45013b977" containerID="56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa" exitCode=0 Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.074570 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.077148 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" event={"ID":"08634871-e819-4b75-93e5-fed45013b977","Type":"ContainerDied","Data":"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa"} Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.077187 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sh4c" event={"ID":"08634871-e819-4b75-93e5-fed45013b977","Type":"ContainerDied","Data":"86cf8a2fc28135093762a9bafb873d6876ced90069ae33ebc61099b61cde6192"} Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.077208 5118 scope.go:117] "RemoveContainer" containerID="56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.110675 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"] Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.113505 5118 scope.go:117] "RemoveContainer" containerID="56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa" Feb 23 06:51:11 crc kubenswrapper[5118]: E0223 06:51:11.116151 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa\": container with ID starting with 56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa not found: ID does not exist" containerID="56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.116207 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa"} err="failed to 
get container status \"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa\": rpc error: code = NotFound desc = could not find container \"56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa\": container with ID starting with 56c4b84f76a97929b97c7659680f491dddc414a2e85740b53ca450c9a918beaa not found: ID does not exist" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.123653 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sh4c"] Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.705698 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08634871-e819-4b75-93e5-fed45013b977" path="/var/lib/kubelet/pods/08634871-e819-4b75-93e5-fed45013b977/volumes" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.763703 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.763759 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.821010 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.988004 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:11 crc kubenswrapper[5118]: I0223 06:51:11.988489 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:12 crc kubenswrapper[5118]: I0223 06:51:12.028132 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:51:12 crc kubenswrapper[5118]: I0223 06:51:12.130230 5118 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-svbnv" Feb 23 06:51:12 crc kubenswrapper[5118]: I0223 06:51:12.137209 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m8j96" Feb 23 06:52:32 crc kubenswrapper[5118]: I0223 06:52:32.975330 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:52:32 crc kubenswrapper[5118]: I0223 06:52:32.976018 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:53:02 crc kubenswrapper[5118]: I0223 06:53:02.975252 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:53:02 crc kubenswrapper[5118]: I0223 06:53:02.975881 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:53:32 crc kubenswrapper[5118]: I0223 06:53:32.975452 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:53:32 crc kubenswrapper[5118]: I0223 06:53:32.976283 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:53:32 crc kubenswrapper[5118]: I0223 06:53:32.976344 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:53:32 crc kubenswrapper[5118]: I0223 06:53:32.977025 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:53:32 crc kubenswrapper[5118]: I0223 06:53:32.977070 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298" gracePeriod=600 Feb 23 06:53:34 crc kubenswrapper[5118]: I0223 06:53:34.061475 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298" exitCode=0 Feb 23 06:53:34 crc kubenswrapper[5118]: I0223 06:53:34.061607 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298"} Feb 23 06:53:34 crc kubenswrapper[5118]: I0223 06:53:34.061861 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117"} Feb 23 06:53:34 crc kubenswrapper[5118]: I0223 06:53:34.061901 5118 scope.go:117] "RemoveContainer" containerID="e16b5ff6d4e5c69b2f860e4229e93318a9d047d8241bd58ca765a2e5ab3beeeb" Feb 23 06:56:02 crc kubenswrapper[5118]: I0223 06:56:02.975422 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:56:02 crc kubenswrapper[5118]: I0223 06:56:02.976056 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:56:32 crc kubenswrapper[5118]: I0223 06:56:32.976027 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:56:32 crc kubenswrapper[5118]: I0223 06:56:32.977200 5118 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:57:00 crc kubenswrapper[5118]: I0223 06:57:00.023993 5118 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 06:57:02 crc kubenswrapper[5118]: I0223 06:57:02.975475 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:57:02 crc kubenswrapper[5118]: I0223 06:57:02.976471 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:57:02 crc kubenswrapper[5118]: I0223 06:57:02.976591 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 06:57:02 crc kubenswrapper[5118]: I0223 06:57:02.977578 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 06:57:02 crc kubenswrapper[5118]: I0223 06:57:02.977685 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117" gracePeriod=600 Feb 23 06:57:03 crc kubenswrapper[5118]: I0223 06:57:03.611935 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117"} Feb 23 06:57:03 crc kubenswrapper[5118]: I0223 06:57:03.611961 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117" exitCode=0 Feb 23 06:57:03 crc kubenswrapper[5118]: I0223 06:57:03.612256 5118 scope.go:117] "RemoveContainer" containerID="68f4481a6269e5ecdc2ade0c03e9fffa62bc33bc2b72cad80c18c37653223298" Feb 23 06:57:03 crc kubenswrapper[5118]: I0223 06:57:03.612293 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73"} Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.264500 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"] Feb 23 06:57:30 crc kubenswrapper[5118]: E0223 06:57:30.266853 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08634871-e819-4b75-93e5-fed45013b977" containerName="registry" Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.266883 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="08634871-e819-4b75-93e5-fed45013b977" containerName="registry" Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.267173 5118 
memory_manager.go:354] "RemoveStaleState removing state" podUID="08634871-e819-4b75-93e5-fed45013b977" containerName="registry"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.269228 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.278184 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"]
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.421196 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.421291 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.421349 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqw28\" (UniqueName: \"kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.523031 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.523088 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.523142 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqw28\" (UniqueName: \"kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.523754 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.523959 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.555178 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqw28\" (UniqueName: \"kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28\") pod \"redhat-operators-4gnlk\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") " pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.592681 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:30 crc kubenswrapper[5118]: I0223 06:57:30.841109 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"]
Feb 23 06:57:30 crc kubenswrapper[5118]: W0223 06:57:30.847919 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8921c292_f7ba_475b_b635_93f3c34c9c4f.slice/crio-c51ca6d4c52f3defb8508b6ab06bd3662099fc0215c512a7ce9b1ad564b3b619 WatchSource:0}: Error finding container c51ca6d4c52f3defb8508b6ab06bd3662099fc0215c512a7ce9b1ad564b3b619: Status 404 returned error can't find the container with id c51ca6d4c52f3defb8508b6ab06bd3662099fc0215c512a7ce9b1ad564b3b619
Feb 23 06:57:31 crc kubenswrapper[5118]: I0223 06:57:31.836303 5118 generic.go:334] "Generic (PLEG): container finished" podID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerID="3d7c9cfb34f846ed35d850c7e64c8e8c3e1afae0b7e401ec3fe781689535a7de" exitCode=0
Feb 23 06:57:31 crc kubenswrapper[5118]: I0223 06:57:31.836426 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerDied","Data":"3d7c9cfb34f846ed35d850c7e64c8e8c3e1afae0b7e401ec3fe781689535a7de"}
Feb 23 06:57:31 crc kubenswrapper[5118]: I0223 06:57:31.836632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerStarted","Data":"c51ca6d4c52f3defb8508b6ab06bd3662099fc0215c512a7ce9b1ad564b3b619"}
Feb 23 06:57:31 crc kubenswrapper[5118]: I0223 06:57:31.839304 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 06:57:32 crc kubenswrapper[5118]: I0223 06:57:32.844586 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerStarted","Data":"8ceb73e4c268d4d75eda4ac1741632656cd5ad921e9c5573832fb7f97c9961ed"}
Feb 23 06:57:33 crc kubenswrapper[5118]: I0223 06:57:33.854271 5118 generic.go:334] "Generic (PLEG): container finished" podID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerID="8ceb73e4c268d4d75eda4ac1741632656cd5ad921e9c5573832fb7f97c9961ed" exitCode=0
Feb 23 06:57:33 crc kubenswrapper[5118]: I0223 06:57:33.854410 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerDied","Data":"8ceb73e4c268d4d75eda4ac1741632656cd5ad921e9c5573832fb7f97c9961ed"}
Feb 23 06:57:34 crc kubenswrapper[5118]: I0223 06:57:34.866936 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerStarted","Data":"6b2874e71a3ac99c884670f5ca263aa6923bdb121369f78cb60afaf9d60ddfa9"}
Feb 23 06:57:34 crc kubenswrapper[5118]: I0223 06:57:34.899811 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4gnlk" podStartSLOduration=2.4842120149999998 podStartE2EDuration="4.899769246s" podCreationTimestamp="2026-02-23 06:57:30 +0000 UTC" firstStartedPulling="2026-02-23 06:57:31.838969378 +0000 UTC m=+714.842753951" lastFinishedPulling="2026-02-23 06:57:34.254526569 +0000 UTC m=+717.258311182" observedRunningTime="2026-02-23 06:57:34.897497718 +0000 UTC m=+717.901282301" watchObservedRunningTime="2026-02-23 06:57:34.899769246 +0000 UTC m=+717.903553899"
Feb 23 06:57:40 crc kubenswrapper[5118]: I0223 06:57:40.593391 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:40 crc kubenswrapper[5118]: I0223 06:57:40.593628 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:41 crc kubenswrapper[5118]: I0223 06:57:41.658286 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4gnlk" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="registry-server" probeResult="failure" output=<
Feb 23 06:57:41 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s
Feb 23 06:57:41 crc kubenswrapper[5118]: >
Feb 23 06:57:50 crc kubenswrapper[5118]: I0223 06:57:50.640758 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:50 crc kubenswrapper[5118]: I0223 06:57:50.701199 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:50 crc kubenswrapper[5118]: I0223 06:57:50.877858 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"]
Feb 23 06:57:51 crc kubenswrapper[5118]: I0223 06:57:51.983292 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4gnlk" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="registry-server" containerID="cri-o://6b2874e71a3ac99c884670f5ca263aa6923bdb121369f78cb60afaf9d60ddfa9" gracePeriod=2
Feb 23 06:57:52 crc kubenswrapper[5118]: I0223 06:57:52.992070 5118 generic.go:334] "Generic (PLEG): container finished" podID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerID="6b2874e71a3ac99c884670f5ca263aa6923bdb121369f78cb60afaf9d60ddfa9" exitCode=0
Feb 23 06:57:52 crc kubenswrapper[5118]: I0223 06:57:52.992124 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerDied","Data":"6b2874e71a3ac99c884670f5ca263aa6923bdb121369f78cb60afaf9d60ddfa9"}
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.102151 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.257537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqw28\" (UniqueName: \"kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28\") pod \"8921c292-f7ba-475b-b635-93f3c34c9c4f\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") "
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.257663 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities\") pod \"8921c292-f7ba-475b-b635-93f3c34c9c4f\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") "
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.257841 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content\") pod \"8921c292-f7ba-475b-b635-93f3c34c9c4f\" (UID: \"8921c292-f7ba-475b-b635-93f3c34c9c4f\") "
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.259738 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities" (OuterVolumeSpecName: "utilities") pod "8921c292-f7ba-475b-b635-93f3c34c9c4f" (UID: "8921c292-f7ba-475b-b635-93f3c34c9c4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.268346 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28" (OuterVolumeSpecName: "kube-api-access-hqw28") pod "8921c292-f7ba-475b-b635-93f3c34c9c4f" (UID: "8921c292-f7ba-475b-b635-93f3c34c9c4f"). InnerVolumeSpecName "kube-api-access-hqw28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.359995 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqw28\" (UniqueName: \"kubernetes.io/projected/8921c292-f7ba-475b-b635-93f3c34c9c4f-kube-api-access-hqw28\") on node \"crc\" DevicePath \"\""
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.360062 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.404146 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8921c292-f7ba-475b-b635-93f3c34c9c4f" (UID: "8921c292-f7ba-475b-b635-93f3c34c9c4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 06:57:53 crc kubenswrapper[5118]: I0223 06:57:53.461568 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8921c292-f7ba-475b-b635-93f3c34c9c4f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.011330 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4gnlk" event={"ID":"8921c292-f7ba-475b-b635-93f3c34c9c4f","Type":"ContainerDied","Data":"c51ca6d4c52f3defb8508b6ab06bd3662099fc0215c512a7ce9b1ad564b3b619"}
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.013177 5118 scope.go:117] "RemoveContainer" containerID="6b2874e71a3ac99c884670f5ca263aa6923bdb121369f78cb60afaf9d60ddfa9"
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.011353 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4gnlk"
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.046955 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"]
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.048223 5118 scope.go:117] "RemoveContainer" containerID="8ceb73e4c268d4d75eda4ac1741632656cd5ad921e9c5573832fb7f97c9961ed"
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.051291 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4gnlk"]
Feb 23 06:57:54 crc kubenswrapper[5118]: I0223 06:57:54.078819 5118 scope.go:117] "RemoveContainer" containerID="3d7c9cfb34f846ed35d850c7e64c8e8c3e1afae0b7e401ec3fe781689535a7de"
Feb 23 06:57:55 crc kubenswrapper[5118]: I0223 06:57:55.710786 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" path="/var/lib/kubelet/pods/8921c292-f7ba-475b-b635-93f3c34c9c4f/volumes"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.201663 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p48pl"]
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203153 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-controller" containerID="cri-o://c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203434 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-node" containerID="cri-o://89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203520 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203507 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="northd" containerID="cri-o://a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203582 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="sbdb" containerID="cri-o://1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203716 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-acl-logging" containerID="cri-o://a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.203746 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="nbdb" containerID="cri-o://e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.288851 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovnkube-controller" containerID="cri-o://e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" gracePeriod=30
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.575558 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p48pl_542be1be-130f-46d0-9284-80695c2b17b4/ovn-acl-logging/0.log"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.576436 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p48pl_542be1be-130f-46d0-9284-80695c2b17b4/ovn-controller/0.log"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.576960 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.639679 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vxqgm"]
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.639960 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="nbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.639978 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="nbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.639997 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="northd"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640006 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="northd"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640022 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kubecfg-setup"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640030 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kubecfg-setup"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640039 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="sbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640046 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="sbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640062 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovnkube-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640071 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovnkube-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640079 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640087 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640121 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="extract-content"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640129 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="extract-content"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640140 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640149 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640161 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-acl-logging"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640168 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-acl-logging"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640178 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-node"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640185 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-node"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640197 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="extract-utilities"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640205 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="extract-utilities"
Feb 23 06:58:31 crc kubenswrapper[5118]: E0223 06:58:31.640214 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="registry-server"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640221 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="registry-server"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640346 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640365 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8921c292-f7ba-475b-b635-93f3c34c9c4f" containerName="registry-server"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640378 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovn-acl-logging"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640387 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="sbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640398 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-node"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640407 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="nbdb"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640416 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="northd"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640449 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.640457 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="542be1be-130f-46d0-9284-80695c2b17b4" containerName="ovnkube-controller"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.642509 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm"
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772336 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772395 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772455 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772450 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772503 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfbm5\" (UniqueName: \"kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772509 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772533 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772535 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772591 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772597 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772675 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772693 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772728 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772747 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772765 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772790 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772810 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772831 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772870 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772893 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772959 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.772983 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773019 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"542be1be-130f-46d0-9284-80695c2b17b4\" (UID: \"542be1be-130f-46d0-9284-80695c2b17b4\") "
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773259 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash" (OuterVolumeSpecName: "host-slash") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773312 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773350 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log" (OuterVolumeSpecName: "node-log") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773306 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773362 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773413 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket" (OuterVolumeSpecName: "log-socket") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773388 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773873 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773925 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773949 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773966 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.773977 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774032 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774554 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-node-log\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774592 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-bin\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvsv\" (UniqueName: \"kubernetes.io/projected/c0ca7093-42f4-4588-a317-da04e1136821-kube-api-access-jlvsv\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774627 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0ca7093-42f4-4588-a317-da04e1136821-ovn-node-metrics-cert\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774645 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-netns\") pod \"ovnkube-node-vxqgm\" (UID: 
\"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774665 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774699 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774747 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-systemd-units\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774790 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-etc-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774806 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774824 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-netd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774843 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-env-overrides\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774858 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-config\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774876 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-script-lib\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774892 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-systemd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774916 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-ovn\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774954 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-kubelet\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-slash\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.774986 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-log-socket\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775009 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-var-lib-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775068 5118 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-log-socket\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775079 5118 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775089 5118 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775118 5118 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775144 5118 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775153 5118 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775163 5118 
reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775172 5118 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775185 5118 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775196 5118 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-node-log\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775207 5118 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775218 5118 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775230 5118 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775239 5118 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775248 5118 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775256 5118 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-host-slash\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.775265 5118 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542be1be-130f-46d0-9284-80695c2b17b4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.781799 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.781922 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5" (OuterVolumeSpecName: "kube-api-access-sfbm5") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "kube-api-access-sfbm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.789412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "542be1be-130f-46d0-9284-80695c2b17b4" (UID: "542be1be-130f-46d0-9284-80695c2b17b4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.876072 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-env-overrides\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.876496 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-config\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.876675 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-script-lib\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.876729 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-env-overrides\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.876840 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-systemd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877071 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-ovn\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877031 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-ovn\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877177 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-kubelet\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877231 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-kubelet\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877198 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-slash\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877326 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-slash\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877352 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-log-socket\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877372 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-var-lib-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877414 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-log-socket\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877440 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-config\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877500 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-var-lib-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877544 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-node-log\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877575 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-bin\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877618 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-node-log\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877655 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvsv\" (UniqueName: \"kubernetes.io/projected/c0ca7093-42f4-4588-a317-da04e1136821-kube-api-access-jlvsv\") pod 
\"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877669 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c0ca7093-42f4-4588-a317-da04e1136821-ovnkube-script-lib\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877713 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-bin\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877686 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0ca7093-42f4-4588-a317-da04e1136821-ovn-node-metrics-cert\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.877774 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-netns\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: 
\"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878252 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-netns\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878331 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-run-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878411 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878478 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-systemd-units\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878550 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-etc-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878590 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878640 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-etc-openvswitch\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878635 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-systemd-units\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-netd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878680 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878696 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-host-cni-netd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878887 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542be1be-130f-46d0-9284-80695c2b17b4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878904 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfbm5\" (UniqueName: \"kubernetes.io/projected/542be1be-130f-46d0-9284-80695c2b17b4-kube-api-access-sfbm5\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.878918 5118 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542be1be-130f-46d0-9284-80695c2b17b4-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.879788 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c0ca7093-42f4-4588-a317-da04e1136821-run-systemd\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.884327 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c0ca7093-42f4-4588-a317-da04e1136821-ovn-node-metrics-cert\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.896269 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvsv\" (UniqueName: \"kubernetes.io/projected/c0ca7093-42f4-4588-a317-da04e1136821-kube-api-access-jlvsv\") pod \"ovnkube-node-vxqgm\" (UID: \"c0ca7093-42f4-4588-a317-da04e1136821\") " pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:31 crc kubenswrapper[5118]: I0223 06:58:31.958315 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.314012 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p48pl_542be1be-130f-46d0-9284-80695c2b17b4/ovn-acl-logging/0.log" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.315543 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p48pl_542be1be-130f-46d0-9284-80695c2b17b4/ovn-controller/0.log" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317073 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317236 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" exitCode=0 Feb 23 
06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317317 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317335 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317348 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317363 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317378 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" exitCode=143 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317390 5118 generic.go:334] "Generic (PLEG): container finished" podID="542be1be-130f-46d0-9284-80695c2b17b4" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" exitCode=143 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317472 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317519 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317581 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317607 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317660 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319212 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319260 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319283 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319295 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319310 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319327 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319340 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} Feb 23 06:58:32 crc kubenswrapper[5118]: 
I0223 06:58:32.319351 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319363 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319375 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319390 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319402 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319413 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319424 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319440 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" 
event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319456 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319468 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319480 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319491 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319506 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319517 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319530 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319542 5118 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319553 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319567 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p48pl" event={"ID":"542be1be-130f-46d0-9284-80695c2b17b4","Type":"ContainerDied","Data":"32d3b59891e6e435f7646b74dfdc7379a80499da5f75691c15033df1edb67a62"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319584 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319596 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319607 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319618 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319629 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} Feb 23 06:58:32 crc kubenswrapper[5118]: 
I0223 06:58:32.319640 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319650 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319661 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.319672 5118 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.317634 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.320297 5118 generic.go:334] "Generic (PLEG): container finished" podID="c0ca7093-42f4-4588-a317-da04e1136821" containerID="672ca0a7bcca41dc9d49c5379171c505d8280b73bafa5c95dea74bd73432e1a9" exitCode=0 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.320355 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerDied","Data":"672ca0a7bcca41dc9d49c5379171c505d8280b73bafa5c95dea74bd73432e1a9"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.320406 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" 
event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"b4c4cb66f61a1a4d65806f1ea10a6b22f5f46e03d586d2d2f7872c421f64bd80"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.322208 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xzr6d_282b5bbe-e1e1-4b22-a815-bb27d70e550d/kube-multus/0.log" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.322238 5118 generic.go:334] "Generic (PLEG): container finished" podID="282b5bbe-e1e1-4b22-a815-bb27d70e550d" containerID="ffa6846e24eb1902b29fbc8f9415724c2183c693000f6a00607cfed96c60878b" exitCode=2 Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.322257 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzr6d" event={"ID":"282b5bbe-e1e1-4b22-a815-bb27d70e550d","Type":"ContainerDied","Data":"ffa6846e24eb1902b29fbc8f9415724c2183c693000f6a00607cfed96c60878b"} Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.322943 5118 scope.go:117] "RemoveContainer" containerID="ffa6846e24eb1902b29fbc8f9415724c2183c693000f6a00607cfed96c60878b" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.421976 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.446857 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.464344 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p48pl"] Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.465553 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.467582 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p48pl"] Feb 23 06:58:32 crc 
kubenswrapper[5118]: I0223 06:58:32.483757 5118 scope.go:117] "RemoveContainer" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.500017 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.514809 5118 scope.go:117] "RemoveContainer" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.537341 5118 scope.go:117] "RemoveContainer" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.558023 5118 scope.go:117] "RemoveContainer" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.605810 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.606387 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not exist" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.606418 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} err="failed to get container status \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with 
ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.606447 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.606851 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.606873 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} err="failed to get container status \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.606887 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.607166 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 
06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607186 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} err="failed to get container status \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607201 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.607437 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607457 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} err="failed to get container status \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607468 5118 scope.go:117] "RemoveContainer" 
containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.607711 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607731 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} err="failed to get container status \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.607743 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.607984 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608004 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} err="failed to get container status \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608016 5118 scope.go:117] "RemoveContainer" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.608454 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": container with ID starting with a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43 not found: ID does not exist" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608475 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} err="failed to get container status \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": rpc error: code = NotFound desc = could not find container \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": container with ID starting with a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608487 5118 scope.go:117] "RemoveContainer" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.608731 5118 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": container with ID starting with c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5 not found: ID does not exist" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608751 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} err="failed to get container status \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": rpc error: code = NotFound desc = could not find container \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": container with ID starting with c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.608767 5118 scope.go:117] "RemoveContainer" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: E0223 06:58:32.609016 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": container with ID starting with 71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15 not found: ID does not exist" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609036 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} err="failed to get container status \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": rpc error: code = NotFound desc = could not find container 
\"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": container with ID starting with 71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609049 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609288 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} err="failed to get container status \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609308 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609614 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} err="failed to get container status \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609632 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609917 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} err="failed to get container status \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.609935 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.610352 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} err="failed to get container status \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.610371 5118 scope.go:117] "RemoveContainer" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.610578 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} err="failed to get container status \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 
1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.610597 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.610995 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} err="failed to get container status \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611013 5118 scope.go:117] "RemoveContainer" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611234 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} err="failed to get container status \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": rpc error: code = NotFound desc = could not find container \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": container with ID starting with a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611254 5118 scope.go:117] "RemoveContainer" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611468 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} err="failed to get container status \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": rpc error: code = NotFound desc = could not find container \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": container with ID starting with c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611488 5118 scope.go:117] "RemoveContainer" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611735 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} err="failed to get container status \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": rpc error: code = NotFound desc = could not find container \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": container with ID starting with 71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.611753 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612208 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} err="failed to get container status \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not 
exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612225 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612488 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} err="failed to get container status \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612506 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612922 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} err="failed to get container status \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.612940 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.613182 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} err="failed to get container status 
\"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.613200 5118 scope.go:117] "RemoveContainer" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.613590 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} err="failed to get container status \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.613611 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.619381 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} err="failed to get container status \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.619421 5118 scope.go:117] "RemoveContainer" 
containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620010 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} err="failed to get container status \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": rpc error: code = NotFound desc = could not find container \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": container with ID starting with a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620029 5118 scope.go:117] "RemoveContainer" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620621 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} err="failed to get container status \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": rpc error: code = NotFound desc = could not find container \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": container with ID starting with c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620641 5118 scope.go:117] "RemoveContainer" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620940 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} err="failed to get container status \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": rpc error: code = NotFound desc = could 
not find container \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": container with ID starting with 71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.620957 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.621584 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} err="failed to get container status \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.621609 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.622035 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} err="failed to get container status \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.622052 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 
06:58:32.622426 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} err="failed to get container status \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.622445 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.623308 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} err="failed to get container status \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.623330 5118 scope.go:117] "RemoveContainer" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.623750 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} err="failed to get container status \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 
1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.623767 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.624492 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} err="failed to get container status \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.624599 5118 scope.go:117] "RemoveContainer" containerID="a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625024 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43"} err="failed to get container status \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": rpc error: code = NotFound desc = could not find container \"a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43\": container with ID starting with a5d0bbf165d2070dfdc7567af707a6e97df5b08b2f38cf35568a6e6d9f145a43 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625049 5118 scope.go:117] "RemoveContainer" containerID="c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625300 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5"} err="failed to get container status \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": rpc error: code = NotFound desc = could not find container \"c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5\": container with ID starting with c127ba7b82fe8306912a48475e58b7be0583525933cdfbe05f85036cb0575fa5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625321 5118 scope.go:117] "RemoveContainer" containerID="71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625563 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15"} err="failed to get container status \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": rpc error: code = NotFound desc = could not find container \"71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15\": container with ID starting with 71ffa16c756e3d0626d9a3f0a07b6f6ede5e6c8b78a5161618aac9626429dd15 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625583 5118 scope.go:117] "RemoveContainer" containerID="e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625802 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb"} err="failed to get container status \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": rpc error: code = NotFound desc = could not find container \"e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb\": container with ID starting with e936fcf2c6862999a019186cb16a84619ed8300ca4926cc6ef61ff012658fdeb not found: ID does not 
exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.625823 5118 scope.go:117] "RemoveContainer" containerID="1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626028 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc"} err="failed to get container status \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": rpc error: code = NotFound desc = could not find container \"1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc\": container with ID starting with 1b0410c7cf6fcd39f2771adc4eca38bda0973eda883e994f6d736ae95e1e9adc not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626050 5118 scope.go:117] "RemoveContainer" containerID="e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626275 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273"} err="failed to get container status \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": rpc error: code = NotFound desc = could not find container \"e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273\": container with ID starting with e2410d00a835c555ae711e36ee7710a07179b8b3b63921871696c4c7b85f6273 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626295 5118 scope.go:117] "RemoveContainer" containerID="a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626475 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5"} err="failed to get container status 
\"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": rpc error: code = NotFound desc = could not find container \"a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5\": container with ID starting with a482020edbcda6dee2136e5a3f66228727f47ea5fcceb593fd3212640a75d5e5 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626494 5118 scope.go:117] "RemoveContainer" containerID="1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626679 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9"} err="failed to get container status \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": rpc error: code = NotFound desc = could not find container \"1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9\": container with ID starting with 1b945928a7b6f5f303fc8db2252d183e737420e30fbd6397d3d5260b3bdb0ca9 not found: ID does not exist" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626699 5118 scope.go:117] "RemoveContainer" containerID="89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03" Feb 23 06:58:32 crc kubenswrapper[5118]: I0223 06:58:32.626884 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03"} err="failed to get container status \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": rpc error: code = NotFound desc = could not find container \"89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03\": container with ID starting with 89a0a00083eeb9edb2deaae6e6ef13f90330c7b7a32b598e37d8e18c99b94f03 not found: ID does not exist" Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.337686 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"fbaf525befeac356a40386765259331a8fe40677b35518d45e3c3740a18a7c0c"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.338306 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"6a65662e7206798cd934c266dde454f6d85ac92b8f0dfcd6302d52bc62356125"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.338329 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"6bd0f6817c7541277ddeb8485645ea16b395e1e68989d3ed7f60574e982dc8f5"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.338351 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"6e5689652b85fdb244f3fa4615c7eb38b344ad88ce6f4be168b966d5353bdf7e"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.338372 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"0681cff55a416f2843ad441de193d475234e3628ca370b0656cf8d6677ff1157"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.338391 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"88bcaa2ccb5e321917eaf59d216d5090e06dc546b1117672614ea48a21ffe2a7"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.340796 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-xzr6d_282b5bbe-e1e1-4b22-a815-bb27d70e550d/kube-multus/0.log" Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.340937 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xzr6d" event={"ID":"282b5bbe-e1e1-4b22-a815-bb27d70e550d","Type":"ContainerStarted","Data":"57cac255b57dc4694211072bf9b0ecf37bc7ecc0176cc3a83f00f1a08ad33442"} Feb 23 06:58:33 crc kubenswrapper[5118]: I0223 06:58:33.706262 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="542be1be-130f-46d0-9284-80695c2b17b4" path="/var/lib/kubelet/pods/542be1be-130f-46d0-9284-80695c2b17b4/volumes" Feb 23 06:58:36 crc kubenswrapper[5118]: I0223 06:58:36.391387 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"e7a6b19ba2c25840d977721e897f108c72e55390a9112d99b06861423a8457f8"} Feb 23 06:58:38 crc kubenswrapper[5118]: I0223 06:58:38.412676 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" event={"ID":"c0ca7093-42f4-4588-a317-da04e1136821","Type":"ContainerStarted","Data":"e3e51a491aa14cc3eff06d09998c10b94d898921824f33cc6be1510f96ff3615"} Feb 23 06:58:38 crc kubenswrapper[5118]: I0223 06:58:38.413199 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:38 crc kubenswrapper[5118]: I0223 06:58:38.413219 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:38 crc kubenswrapper[5118]: I0223 06:58:38.453790 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:38 crc kubenswrapper[5118]: I0223 06:58:38.459560 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" podStartSLOduration=7.459532118 podStartE2EDuration="7.459532118s" podCreationTimestamp="2026-02-23 06:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:58:38.453986319 +0000 UTC m=+781.457770912" watchObservedRunningTime="2026-02-23 06:58:38.459532118 +0000 UTC m=+781.463316731" Feb 23 06:58:39 crc kubenswrapper[5118]: I0223 06:58:39.419030 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:39 crc kubenswrapper[5118]: I0223 06:58:39.460681 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.925285 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-cgwkc"] Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.927834 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.931476 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.931748 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.931786 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.931482 5118 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hsj24" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.932523 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.932639 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.932860 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkd7\" (UniqueName: \"kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:40 crc kubenswrapper[5118]: I0223 06:58:40.934720 5118 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-cgwkc"] Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.034665 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.034779 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.035050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkd7\" (UniqueName: \"kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.035209 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.035860 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 
06:58:41.060349 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkd7\" (UniqueName: \"kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7\") pod \"crc-storage-crc-cgwkc\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.247437 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.291038 5118 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(bbfeda66bec19dca9785605d68629b2772e3a219161542829937ed63492a4387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.291174 5118 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(bbfeda66bec19dca9785605d68629b2772e3a219161542829937ed63492a4387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.291226 5118 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(bbfeda66bec19dca9785605d68629b2772e3a219161542829937ed63492a4387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.291310 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-cgwkc_crc-storage(5b146f5c-57ff-45e2-862d-41329e4fd358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-cgwkc_crc-storage(5b146f5c-57ff-45e2-862d-41329e4fd358)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(bbfeda66bec19dca9785605d68629b2772e3a219161542829937ed63492a4387): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-cgwkc" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.431045 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: I0223 06:58:41.431932 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.478262 5118 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(6e9b4cf4e8f4b83a2e896f4849d238ad38817385d48de67705aae4b23776114c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.478349 5118 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(6e9b4cf4e8f4b83a2e896f4849d238ad38817385d48de67705aae4b23776114c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.478374 5118 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(6e9b4cf4e8f4b83a2e896f4849d238ad38817385d48de67705aae4b23776114c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:41 crc kubenswrapper[5118]: E0223 06:58:41.478444 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-cgwkc_crc-storage(5b146f5c-57ff-45e2-862d-41329e4fd358)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-cgwkc_crc-storage(5b146f5c-57ff-45e2-862d-41329e4fd358)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-cgwkc_crc-storage_5b146f5c-57ff-45e2-862d-41329e4fd358_0(6e9b4cf4e8f4b83a2e896f4849d238ad38817385d48de67705aae4b23776114c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-cgwkc" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" Feb 23 06:58:56 crc kubenswrapper[5118]: I0223 06:58:56.697055 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:56 crc kubenswrapper[5118]: I0223 06:58:56.700089 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:58:57 crc kubenswrapper[5118]: I0223 06:58:57.029696 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-cgwkc"] Feb 23 06:58:57 crc kubenswrapper[5118]: W0223 06:58:57.039652 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b146f5c_57ff_45e2_862d_41329e4fd358.slice/crio-fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d WatchSource:0}: Error finding container fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d: Status 404 returned error can't find the container with id fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d Feb 23 06:58:57 crc kubenswrapper[5118]: I0223 06:58:57.542757 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-cgwkc" event={"ID":"5b146f5c-57ff-45e2-862d-41329e4fd358","Type":"ContainerStarted","Data":"fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d"} Feb 23 06:58:59 crc kubenswrapper[5118]: I0223 06:58:59.557504 5118 generic.go:334] "Generic (PLEG): container finished" podID="5b146f5c-57ff-45e2-862d-41329e4fd358" containerID="68148643f0bfaec0fd75a57daaf7377976aaddd6db878112a4bb51c63721ea41" exitCode=0 Feb 23 06:58:59 crc kubenswrapper[5118]: I0223 06:58:59.557598 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-cgwkc" event={"ID":"5b146f5c-57ff-45e2-862d-41329e4fd358","Type":"ContainerDied","Data":"68148643f0bfaec0fd75a57daaf7377976aaddd6db878112a4bb51c63721ea41"} Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.801855 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.866177 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nkd7\" (UniqueName: \"kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7\") pod \"5b146f5c-57ff-45e2-862d-41329e4fd358\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.866263 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt\") pod \"5b146f5c-57ff-45e2-862d-41329e4fd358\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.866370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage\") pod \"5b146f5c-57ff-45e2-862d-41329e4fd358\" (UID: \"5b146f5c-57ff-45e2-862d-41329e4fd358\") " Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.866493 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "5b146f5c-57ff-45e2-862d-41329e4fd358" (UID: "5b146f5c-57ff-45e2-862d-41329e4fd358"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.866957 5118 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/5b146f5c-57ff-45e2-862d-41329e4fd358-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.875479 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7" (OuterVolumeSpecName: "kube-api-access-4nkd7") pod "5b146f5c-57ff-45e2-862d-41329e4fd358" (UID: "5b146f5c-57ff-45e2-862d-41329e4fd358"). InnerVolumeSpecName "kube-api-access-4nkd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.888502 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "5b146f5c-57ff-45e2-862d-41329e4fd358" (UID: "5b146f5c-57ff-45e2-862d-41329e4fd358"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.968570 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nkd7\" (UniqueName: \"kubernetes.io/projected/5b146f5c-57ff-45e2-862d-41329e4fd358-kube-api-access-4nkd7\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:00 crc kubenswrapper[5118]: I0223 06:59:00.968623 5118 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/5b146f5c-57ff-45e2-862d-41329e4fd358-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:01 crc kubenswrapper[5118]: I0223 06:59:01.574043 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-cgwkc" event={"ID":"5b146f5c-57ff-45e2-862d-41329e4fd358","Type":"ContainerDied","Data":"fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d"} Feb 23 06:59:01 crc kubenswrapper[5118]: I0223 06:59:01.574205 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fee65ded9a4516b2f78562e4a5025248f9b9cf01564394c7d9c4f8c5a92b977d" Feb 23 06:59:01 crc kubenswrapper[5118]: I0223 06:59:01.574268 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-cgwkc" Feb 23 06:59:01 crc kubenswrapper[5118]: I0223 06:59:01.986868 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vxqgm" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.484297 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:02 crc kubenswrapper[5118]: E0223 06:59:02.485002 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" containerName="storage" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.485018 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" containerName="storage" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.485168 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" containerName="storage" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.486140 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.507319 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.592795 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fh6k\" (UniqueName: \"kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.592945 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.592981 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.693987 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.694045 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.694121 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fh6k\" (UniqueName: \"kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.694980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.695092 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.726402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fh6k\" (UniqueName: \"kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k\") pod \"redhat-marketplace-7ggqg\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:02 crc kubenswrapper[5118]: I0223 06:59:02.808853 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:03 crc kubenswrapper[5118]: I0223 06:59:03.031848 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:03 crc kubenswrapper[5118]: I0223 06:59:03.599131 5118 generic.go:334] "Generic (PLEG): container finished" podID="9c70aa25-7823-4713-9e15-a05939f28d00" containerID="e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247" exitCode=0 Feb 23 06:59:03 crc kubenswrapper[5118]: I0223 06:59:03.599216 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerDied","Data":"e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247"} Feb 23 06:59:03 crc kubenswrapper[5118]: I0223 06:59:03.599260 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerStarted","Data":"6e3a5c55dc861be265516aa0a8336d9d74c59892ab53813497e037ef40a18fb2"} Feb 23 06:59:04 crc kubenswrapper[5118]: I0223 06:59:04.609034 5118 generic.go:334] "Generic (PLEG): container finished" podID="9c70aa25-7823-4713-9e15-a05939f28d00" containerID="13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295" exitCode=0 Feb 23 06:59:04 crc kubenswrapper[5118]: I0223 06:59:04.609264 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerDied","Data":"13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295"} Feb 23 06:59:05 crc kubenswrapper[5118]: I0223 06:59:05.619064 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" 
event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerStarted","Data":"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007"} Feb 23 06:59:05 crc kubenswrapper[5118]: I0223 06:59:05.645953 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ggqg" podStartSLOduration=2.278768364 podStartE2EDuration="3.645918532s" podCreationTimestamp="2026-02-23 06:59:02 +0000 UTC" firstStartedPulling="2026-02-23 06:59:03.602601111 +0000 UTC m=+806.606385684" lastFinishedPulling="2026-02-23 06:59:04.969751289 +0000 UTC m=+807.973535852" observedRunningTime="2026-02-23 06:59:05.639745317 +0000 UTC m=+808.643529890" watchObservedRunningTime="2026-02-23 06:59:05.645918532 +0000 UTC m=+808.649703115" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.158743 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d"] Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.181463 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.185025 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.194070 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d"] Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.286581 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.286682 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.286703 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4qc\" (UniqueName: \"kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: 
I0223 06:59:08.387801 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.387873 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4qc\" (UniqueName: \"kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.387909 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.388818 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.388876 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.415246 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4qc\" (UniqueName: \"kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.524555 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:08 crc kubenswrapper[5118]: I0223 06:59:08.802693 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d"] Feb 23 06:59:08 crc kubenswrapper[5118]: W0223 06:59:08.822380 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd42c5ed_566b_45c4_b508_b14c7f0a2512.slice/crio-c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56 WatchSource:0}: Error finding container c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56: Status 404 returned error can't find the container with id c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56 Feb 23 06:59:09 crc kubenswrapper[5118]: I0223 06:59:09.651197 5118 generic.go:334] "Generic (PLEG): container finished" podID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerID="9e94daec13f34becb258ce01400e7476c732f11236a6c2a4f537fd498351e9ff" exitCode=0 
Feb 23 06:59:09 crc kubenswrapper[5118]: I0223 06:59:09.651263 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" event={"ID":"dd42c5ed-566b-45c4-b508-b14c7f0a2512","Type":"ContainerDied","Data":"9e94daec13f34becb258ce01400e7476c732f11236a6c2a4f537fd498351e9ff"} Feb 23 06:59:09 crc kubenswrapper[5118]: I0223 06:59:09.651711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" event={"ID":"dd42c5ed-566b-45c4-b508-b14c7f0a2512","Type":"ContainerStarted","Data":"c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56"} Feb 23 06:59:11 crc kubenswrapper[5118]: I0223 06:59:11.672942 5118 generic.go:334] "Generic (PLEG): container finished" podID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerID="d0347108ae36e541e4c42185f97da15297992ec7d64644b98484ff3bdcc9574c" exitCode=0 Feb 23 06:59:11 crc kubenswrapper[5118]: I0223 06:59:11.673053 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" event={"ID":"dd42c5ed-566b-45c4-b508-b14c7f0a2512","Type":"ContainerDied","Data":"d0347108ae36e541e4c42185f97da15297992ec7d64644b98484ff3bdcc9574c"} Feb 23 06:59:12 crc kubenswrapper[5118]: I0223 06:59:12.682901 5118 generic.go:334] "Generic (PLEG): container finished" podID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerID="0d5d42cbdfa012f6b8881d4cd8037bf33811dae6e3b5c07ffe1846fdb8dcb46c" exitCode=0 Feb 23 06:59:12 crc kubenswrapper[5118]: I0223 06:59:12.683342 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" event={"ID":"dd42c5ed-566b-45c4-b508-b14c7f0a2512","Type":"ContainerDied","Data":"0d5d42cbdfa012f6b8881d4cd8037bf33811dae6e3b5c07ffe1846fdb8dcb46c"} Feb 23 06:59:12 crc kubenswrapper[5118]: I0223 
06:59:12.809401 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:12 crc kubenswrapper[5118]: I0223 06:59:12.809498 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:12 crc kubenswrapper[5118]: I0223 06:59:12.887692 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:13 crc kubenswrapper[5118]: I0223 06:59:13.773007 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.024293 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.183149 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4qc\" (UniqueName: \"kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc\") pod \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.183226 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util\") pod \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.183273 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle\") pod \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\" (UID: \"dd42c5ed-566b-45c4-b508-b14c7f0a2512\") " Feb 23 
06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.184398 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle" (OuterVolumeSpecName: "bundle") pod "dd42c5ed-566b-45c4-b508-b14c7f0a2512" (UID: "dd42c5ed-566b-45c4-b508-b14c7f0a2512"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.191551 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc" (OuterVolumeSpecName: "kube-api-access-ft4qc") pod "dd42c5ed-566b-45c4-b508-b14c7f0a2512" (UID: "dd42c5ed-566b-45c4-b508-b14c7f0a2512"). InnerVolumeSpecName "kube-api-access-ft4qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.284951 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4qc\" (UniqueName: \"kubernetes.io/projected/dd42c5ed-566b-45c4-b508-b14c7f0a2512-kube-api-access-ft4qc\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.284984 5118 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.374012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util" (OuterVolumeSpecName: "util") pod "dd42c5ed-566b-45c4-b508-b14c7f0a2512" (UID: "dd42c5ed-566b-45c4-b508-b14c7f0a2512"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.386679 5118 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd42c5ed-566b-45c4-b508-b14c7f0a2512-util\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.702485 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" event={"ID":"dd42c5ed-566b-45c4-b508-b14c7f0a2512","Type":"ContainerDied","Data":"c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56"} Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.702840 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c168bf7215b1442f56bb2c7a68a6f32e2e5f8bf02ea05167f80c1d77fb015b56" Feb 23 06:59:14 crc kubenswrapper[5118]: I0223 06:59:14.702508 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d" Feb 23 06:59:15 crc kubenswrapper[5118]: I0223 06:59:15.851815 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:15 crc kubenswrapper[5118]: I0223 06:59:15.852663 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7ggqg" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="registry-server" containerID="cri-o://016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007" gracePeriod=2 Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.270277 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.426893 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content\") pod \"9c70aa25-7823-4713-9e15-a05939f28d00\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.427015 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fh6k\" (UniqueName: \"kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k\") pod \"9c70aa25-7823-4713-9e15-a05939f28d00\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.427089 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities\") pod \"9c70aa25-7823-4713-9e15-a05939f28d00\" (UID: \"9c70aa25-7823-4713-9e15-a05939f28d00\") " Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.429672 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities" (OuterVolumeSpecName: "utilities") pod "9c70aa25-7823-4713-9e15-a05939f28d00" (UID: "9c70aa25-7823-4713-9e15-a05939f28d00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.439133 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k" (OuterVolumeSpecName: "kube-api-access-9fh6k") pod "9c70aa25-7823-4713-9e15-a05939f28d00" (UID: "9c70aa25-7823-4713-9e15-a05939f28d00"). InnerVolumeSpecName "kube-api-access-9fh6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.478865 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c70aa25-7823-4713-9e15-a05939f28d00" (UID: "9c70aa25-7823-4713-9e15-a05939f28d00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.529512 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.529572 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fh6k\" (UniqueName: \"kubernetes.io/projected/9c70aa25-7823-4713-9e15-a05939f28d00-kube-api-access-9fh6k\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.529594 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c70aa25-7823-4713-9e15-a05939f28d00-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.726821 5118 generic.go:334] "Generic (PLEG): container finished" podID="9c70aa25-7823-4713-9e15-a05939f28d00" containerID="016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007" exitCode=0 Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.726892 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerDied","Data":"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007"} Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.726927 5118 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ggqg" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.726950 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ggqg" event={"ID":"9c70aa25-7823-4713-9e15-a05939f28d00","Type":"ContainerDied","Data":"6e3a5c55dc861be265516aa0a8336d9d74c59892ab53813497e037ef40a18fb2"} Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.726989 5118 scope.go:117] "RemoveContainer" containerID="016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.751611 5118 scope.go:117] "RemoveContainer" containerID="13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.777370 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.783834 5118 scope.go:117] "RemoveContainer" containerID="e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.785978 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ggqg"] Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.810302 5118 scope.go:117] "RemoveContainer" containerID="016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007" Feb 23 06:59:16 crc kubenswrapper[5118]: E0223 06:59:16.810890 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007\": container with ID starting with 016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007 not found: ID does not exist" containerID="016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.810929 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007"} err="failed to get container status \"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007\": rpc error: code = NotFound desc = could not find container \"016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007\": container with ID starting with 016b1950242699023ffd7c8bb1d28fdf8c883a128dcdb51070c04fa801dc7007 not found: ID does not exist" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.810962 5118 scope.go:117] "RemoveContainer" containerID="13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295" Feb 23 06:59:16 crc kubenswrapper[5118]: E0223 06:59:16.811597 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295\": container with ID starting with 13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295 not found: ID does not exist" containerID="13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.811659 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295"} err="failed to get container status \"13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295\": rpc error: code = NotFound desc = could not find container \"13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295\": container with ID starting with 13c3bfa00202122936f4d3f24c935b5ddeac9268e11313cd5a3a8eed97161295 not found: ID does not exist" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.811704 5118 scope.go:117] "RemoveContainer" containerID="e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247" Feb 23 06:59:16 crc kubenswrapper[5118]: E0223 
06:59:16.812213 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247\": container with ID starting with e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247 not found: ID does not exist" containerID="e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247" Feb 23 06:59:16 crc kubenswrapper[5118]: I0223 06:59:16.812252 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247"} err="failed to get container status \"e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247\": rpc error: code = NotFound desc = could not find container \"e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247\": container with ID starting with e73c9e73d528ca4510faf0482bb7b19cc07f3754762eb2d525257671e4159247 not found: ID does not exist" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.710000 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" path="/var/lib/kubelet/pods/9c70aa25-7823-4713-9e15-a05939f28d00/volumes" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.852443 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-gg2kz"] Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.852891 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="extract-content" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.852931 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="extract-content" Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.852977 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="pull" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.852995 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="pull" Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.853020 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="registry-server" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853041 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="registry-server" Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.853070 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="extract" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853086 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="extract" Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.853140 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="util" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853157 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="util" Feb 23 06:59:17 crc kubenswrapper[5118]: E0223 06:59:17.853191 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="extract-utilities" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853208 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="extract-utilities" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853457 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd42c5ed-566b-45c4-b508-b14c7f0a2512" containerName="extract" 
Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.853507 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c70aa25-7823-4713-9e15-a05939f28d00" containerName="registry-server" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.855373 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.861350 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-gg2kz"] Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.896917 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.898543 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-9mfcp" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.898545 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 23 06:59:17 crc kubenswrapper[5118]: I0223 06:59:17.950989 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52tcc\" (UniqueName: \"kubernetes.io/projected/9334f91a-86de-49e4-bc57-04b55a83f25d-kube-api-access-52tcc\") pod \"nmstate-operator-694c9596b7-gg2kz\" (UID: \"9334f91a-86de-49e4-bc57-04b55a83f25d\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" Feb 23 06:59:18 crc kubenswrapper[5118]: I0223 06:59:18.052651 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52tcc\" (UniqueName: \"kubernetes.io/projected/9334f91a-86de-49e4-bc57-04b55a83f25d-kube-api-access-52tcc\") pod \"nmstate-operator-694c9596b7-gg2kz\" (UID: \"9334f91a-86de-49e4-bc57-04b55a83f25d\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" Feb 23 06:59:18 crc 
kubenswrapper[5118]: I0223 06:59:18.096443 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52tcc\" (UniqueName: \"kubernetes.io/projected/9334f91a-86de-49e4-bc57-04b55a83f25d-kube-api-access-52tcc\") pod \"nmstate-operator-694c9596b7-gg2kz\" (UID: \"9334f91a-86de-49e4-bc57-04b55a83f25d\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" Feb 23 06:59:18 crc kubenswrapper[5118]: I0223 06:59:18.214705 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" Feb 23 06:59:18 crc kubenswrapper[5118]: I0223 06:59:18.452896 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-gg2kz"] Feb 23 06:59:18 crc kubenswrapper[5118]: I0223 06:59:18.742609 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" event={"ID":"9334f91a-86de-49e4-bc57-04b55a83f25d","Type":"ContainerStarted","Data":"e728706d9d288ecb66b6c5fe300c2c4fdf062782c387370a85a746130fbddb7d"} Feb 23 06:59:20 crc kubenswrapper[5118]: I0223 06:59:20.766791 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" event={"ID":"9334f91a-86de-49e4-bc57-04b55a83f25d","Type":"ContainerStarted","Data":"73e6a4c6f3301b7959699636c0cb17a5366822187e652bab6a6efa7934f1848a"} Feb 23 06:59:20 crc kubenswrapper[5118]: I0223 06:59:20.794575 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-gg2kz" podStartSLOduration=1.963124498 podStartE2EDuration="3.794552264s" podCreationTimestamp="2026-02-23 06:59:17 +0000 UTC" firstStartedPulling="2026-02-23 06:59:18.463465473 +0000 UTC m=+821.467250066" lastFinishedPulling="2026-02-23 06:59:20.294893249 +0000 UTC m=+823.298677832" observedRunningTime="2026-02-23 06:59:20.790438261 +0000 UTC m=+823.794222854" 
watchObservedRunningTime="2026-02-23 06:59:20.794552264 +0000 UTC m=+823.798336837" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.780409 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.782257 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.784162 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-w7sd5" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.801054 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.802016 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.804712 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.809573 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.814954 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kfjtf"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.815889 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.836811 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.902736 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s6cc\" (UniqueName: \"kubernetes.io/projected/2113d5b6-58a4-48b3-87ce-d61904a474f9-kube-api-access-6s6cc\") pod \"nmstate-metrics-58c85c668d-xsnnb\" (UID: \"2113d5b6-58a4-48b3-87ce-d61904a474f9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.904358 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-nmstate-lock\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.904402 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.904467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fsbg\" (UniqueName: \"kubernetes.io/projected/c824e133-db80-49dc-a6a9-e90850e89a2a-kube-api-access-2fsbg\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.904664 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz7l\" (UniqueName: \"kubernetes.io/projected/03b5790f-4c4a-44ae-8236-96ec6d31ab74-kube-api-access-mnz7l\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.904748 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-ovs-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.905320 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-dbus-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.944699 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh"] Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.945513 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.952484 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.952789 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.954581 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qw25r" Feb 23 06:59:27 crc kubenswrapper[5118]: I0223 06:59:27.964277 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh"] Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.006829 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-dbus-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.006886 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s6cc\" (UniqueName: \"kubernetes.io/projected/2113d5b6-58a4-48b3-87ce-d61904a474f9-kube-api-access-6s6cc\") pod \"nmstate-metrics-58c85c668d-xsnnb\" (UID: \"2113d5b6-58a4-48b3-87ce-d61904a474f9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.006917 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-nmstate-lock\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc 
kubenswrapper[5118]: I0223 06:59:28.006943 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.006989 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fsbg\" (UniqueName: \"kubernetes.io/projected/c824e133-db80-49dc-a6a9-e90850e89a2a-kube-api-access-2fsbg\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.007028 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz7l\" (UniqueName: \"kubernetes.io/projected/03b5790f-4c4a-44ae-8236-96ec6d31ab74-kube-api-access-mnz7l\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.007050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-ovs-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: E0223 06:59:28.007161 5118 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.007212 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-ovs-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: E0223 06:59:28.007240 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair podName:03b5790f-4c4a-44ae-8236-96ec6d31ab74 nodeName:}" failed. No retries permitted until 2026-02-23 06:59:28.507219213 +0000 UTC m=+831.511003786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair") pod "nmstate-webhook-866bcb46dc-vvxz2" (UID: "03b5790f-4c4a-44ae-8236-96ec6d31ab74") : secret "openshift-nmstate-webhook" not found Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.007400 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-dbus-socket\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.007544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c824e133-db80-49dc-a6a9-e90850e89a2a-nmstate-lock\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.028027 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fsbg\" (UniqueName: \"kubernetes.io/projected/c824e133-db80-49dc-a6a9-e90850e89a2a-kube-api-access-2fsbg\") pod \"nmstate-handler-kfjtf\" (UID: \"c824e133-db80-49dc-a6a9-e90850e89a2a\") " pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 
06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.040330 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s6cc\" (UniqueName: \"kubernetes.io/projected/2113d5b6-58a4-48b3-87ce-d61904a474f9-kube-api-access-6s6cc\") pod \"nmstate-metrics-58c85c668d-xsnnb\" (UID: \"2113d5b6-58a4-48b3-87ce-d61904a474f9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.041431 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz7l\" (UniqueName: \"kubernetes.io/projected/03b5790f-4c4a-44ae-8236-96ec6d31ab74-kube-api-access-mnz7l\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.108299 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.109016 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.109141 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d848f\" (UniqueName: \"kubernetes.io/projected/0de2854b-06ec-4c6f-816e-f6aba8930366-kube-api-access-d848f\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.109270 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0de2854b-06ec-4c6f-816e-f6aba8930366-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.140703 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.160186 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8495c6f9db-8tm7q"] Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.161538 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.170332 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8495c6f9db-8tm7q"] Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.210630 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.210784 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d848f\" (UniqueName: \"kubernetes.io/projected/0de2854b-06ec-4c6f-816e-f6aba8930366-kube-api-access-d848f\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.210908 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0de2854b-06ec-4c6f-816e-f6aba8930366-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: E0223 06:59:28.211590 5118 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 23 06:59:28 crc kubenswrapper[5118]: E0223 06:59:28.211687 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert podName:0de2854b-06ec-4c6f-816e-f6aba8930366 nodeName:}" failed. No retries permitted until 2026-02-23 06:59:28.711648781 +0000 UTC m=+831.715433524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-ff9mh" (UID: "0de2854b-06ec-4c6f-816e-f6aba8930366") : secret "plugin-serving-cert" not found Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.217665 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0de2854b-06ec-4c6f-816e-f6aba8930366-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.232685 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d848f\" (UniqueName: \"kubernetes.io/projected/0de2854b-06ec-4c6f-816e-f6aba8930366-kube-api-access-d848f\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312268 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312334 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599r4\" (UniqueName: \"kubernetes.io/projected/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-kube-api-access-599r4\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312415 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312442 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-oauth-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312466 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-oauth-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312495 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-service-ca\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.312571 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-trusted-ca-bundle\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414176 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414239 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-oauth-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414263 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-oauth-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414284 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-service-ca\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414329 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-trusted-ca-bundle\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.414378 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-599r4\" (UniqueName: \"kubernetes.io/projected/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-kube-api-access-599r4\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.415904 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.416884 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-service-ca\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.419188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-trusted-ca-bundle\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.420321 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.421471 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-console-oauth-config\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.422324 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-oauth-serving-cert\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.434012 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-599r4\" (UniqueName: \"kubernetes.io/projected/a2e66d5e-cb8b-4d14-974c-cb8601c749b2-kube-api-access-599r4\") pod \"console-8495c6f9db-8tm7q\" (UID: \"a2e66d5e-cb8b-4d14-974c-cb8601c749b2\") " pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.507401 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.515449 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.525236 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/03b5790f-4c4a-44ae-8236-96ec6d31ab74-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-vvxz2\" (UID: \"03b5790f-4c4a-44ae-8236-96ec6d31ab74\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.564055 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb"] Feb 23 06:59:28 crc kubenswrapper[5118]: W0223 06:59:28.569601 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2113d5b6_58a4_48b3_87ce_d61904a474f9.slice/crio-2f75f429d0471f0bb33664aadbd57c90abcdb8a6f05b7e0b66dd8d457dde0483 WatchSource:0}: Error finding container 2f75f429d0471f0bb33664aadbd57c90abcdb8a6f05b7e0b66dd8d457dde0483: Status 404 returned error can't find the container with id 2f75f429d0471f0bb33664aadbd57c90abcdb8a6f05b7e0b66dd8d457dde0483 Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.718948 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.721066 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.723679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0de2854b-06ec-4c6f-816e-f6aba8930366-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ff9mh\" (UID: \"0de2854b-06ec-4c6f-816e-f6aba8930366\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.736236 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8495c6f9db-8tm7q"] Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.824326 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8495c6f9db-8tm7q" event={"ID":"a2e66d5e-cb8b-4d14-974c-cb8601c749b2","Type":"ContainerStarted","Data":"3fed7a9b64105f6168046e011a48d8e80a619365d80971043caaffcd3d3a437f"} Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 
06:59:28.825827 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kfjtf" event={"ID":"c824e133-db80-49dc-a6a9-e90850e89a2a","Type":"ContainerStarted","Data":"46990b9a4ebc45b6e5df900350c9881064d68d8dc30c2155377c6f5d9e7c5106"} Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.827170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" event={"ID":"2113d5b6-58a4-48b3-87ce-d61904a474f9","Type":"ContainerStarted","Data":"2f75f429d0471f0bb33664aadbd57c90abcdb8a6f05b7e0b66dd8d457dde0483"} Feb 23 06:59:28 crc kubenswrapper[5118]: I0223 06:59:28.878004 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.186614 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh"] Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.203506 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2"] Feb 23 06:59:29 crc kubenswrapper[5118]: W0223 06:59:29.212859 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b5790f_4c4a_44ae_8236_96ec6d31ab74.slice/crio-0d67427649e8b3bc85721bec8fe7d6c40ef80371596f016460d72f83910fbaa8 WatchSource:0}: Error finding container 0d67427649e8b3bc85721bec8fe7d6c40ef80371596f016460d72f83910fbaa8: Status 404 returned error can't find the container with id 0d67427649e8b3bc85721bec8fe7d6c40ef80371596f016460d72f83910fbaa8 Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.834288 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" 
event={"ID":"0de2854b-06ec-4c6f-816e-f6aba8930366","Type":"ContainerStarted","Data":"9090b384ac4ec803a1fbaf79058ae04d5ef2ca52205f37ad9e844c96feb67269"} Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.835707 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" event={"ID":"03b5790f-4c4a-44ae-8236-96ec6d31ab74","Type":"ContainerStarted","Data":"0d67427649e8b3bc85721bec8fe7d6c40ef80371596f016460d72f83910fbaa8"} Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.838447 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8495c6f9db-8tm7q" event={"ID":"a2e66d5e-cb8b-4d14-974c-cb8601c749b2","Type":"ContainerStarted","Data":"eab65c2593c91914d04fa6e96b30a3951792802589a968fa90d101611b0c33f1"} Feb 23 06:59:29 crc kubenswrapper[5118]: I0223 06:59:29.872312 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8495c6f9db-8tm7q" podStartSLOduration=1.872279432 podStartE2EDuration="1.872279432s" podCreationTimestamp="2026-02-23 06:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 06:59:29.860353322 +0000 UTC m=+832.864137975" watchObservedRunningTime="2026-02-23 06:59:29.872279432 +0000 UTC m=+832.876064045" Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.858907 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" event={"ID":"0de2854b-06ec-4c6f-816e-f6aba8930366","Type":"ContainerStarted","Data":"0e413b0367b1d669211b8c2477ef1d41bdb95243e4eebac0392b9e0bf03005ce"} Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.861436 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" 
event={"ID":"03b5790f-4c4a-44ae-8236-96ec6d31ab74","Type":"ContainerStarted","Data":"308afad734a24508cfa2eecc609f7f026909a72cbccd2674be8a306c1546a080"} Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.861582 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.864001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" event={"ID":"2113d5b6-58a4-48b3-87ce-d61904a474f9","Type":"ContainerStarted","Data":"5246e1ab27d7fa8884ada571d9a0e0def02b9d1dcaf5f33ba7a38ebaa4e61528"} Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.886971 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ff9mh" podStartSLOduration=3.232367536 podStartE2EDuration="5.886937612s" podCreationTimestamp="2026-02-23 06:59:27 +0000 UTC" firstStartedPulling="2026-02-23 06:59:29.199850472 +0000 UTC m=+832.203635045" lastFinishedPulling="2026-02-23 06:59:31.854420548 +0000 UTC m=+834.858205121" observedRunningTime="2026-02-23 06:59:32.881729161 +0000 UTC m=+835.885513774" watchObservedRunningTime="2026-02-23 06:59:32.886937612 +0000 UTC m=+835.890722195" Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.918794 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" podStartSLOduration=3.267557568 podStartE2EDuration="5.918766041s" podCreationTimestamp="2026-02-23 06:59:27 +0000 UTC" firstStartedPulling="2026-02-23 06:59:29.216439828 +0000 UTC m=+832.220224401" lastFinishedPulling="2026-02-23 06:59:31.867648301 +0000 UTC m=+834.871432874" observedRunningTime="2026-02-23 06:59:32.912751649 +0000 UTC m=+835.916536262" watchObservedRunningTime="2026-02-23 06:59:32.918766041 +0000 UTC m=+835.922550654" Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 
06:59:32.975640 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 06:59:32 crc kubenswrapper[5118]: I0223 06:59:32.975730 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 06:59:34 crc kubenswrapper[5118]: I0223 06:59:34.878338 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" event={"ID":"2113d5b6-58a4-48b3-87ce-d61904a474f9","Type":"ContainerStarted","Data":"59ebb8c87de59e900420f47a45957d4173297b0afab731d9fd3ceca76400de31"} Feb 23 06:59:34 crc kubenswrapper[5118]: I0223 06:59:34.905262 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-xsnnb" podStartSLOduration=1.8375162409999999 podStartE2EDuration="7.905237585s" podCreationTimestamp="2026-02-23 06:59:27 +0000 UTC" firstStartedPulling="2026-02-23 06:59:28.573897708 +0000 UTC m=+831.577682281" lastFinishedPulling="2026-02-23 06:59:34.641619052 +0000 UTC m=+837.645403625" observedRunningTime="2026-02-23 06:59:34.898931808 +0000 UTC m=+837.902716401" watchObservedRunningTime="2026-02-23 06:59:34.905237585 +0000 UTC m=+837.909022158" Feb 23 06:59:35 crc kubenswrapper[5118]: I0223 06:59:35.887707 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kfjtf" event={"ID":"c824e133-db80-49dc-a6a9-e90850e89a2a","Type":"ContainerStarted","Data":"f4567c3fe1d2ac933e703a835c4bda83e5ab94052bdaf0f65d015b23c96cbc7d"} Feb 23 06:59:35 
crc kubenswrapper[5118]: I0223 06:59:35.908361 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kfjtf" podStartSLOduration=2.479315525 podStartE2EDuration="8.908196368s" podCreationTimestamp="2026-02-23 06:59:27 +0000 UTC" firstStartedPulling="2026-02-23 06:59:28.203429945 +0000 UTC m=+831.207214518" lastFinishedPulling="2026-02-23 06:59:34.632310778 +0000 UTC m=+837.636095361" observedRunningTime="2026-02-23 06:59:35.901480799 +0000 UTC m=+838.905265362" watchObservedRunningTime="2026-02-23 06:59:35.908196368 +0000 UTC m=+838.911980941" Feb 23 06:59:36 crc kubenswrapper[5118]: I0223 06:59:36.896304 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:38 crc kubenswrapper[5118]: I0223 06:59:38.507908 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:38 crc kubenswrapper[5118]: I0223 06:59:38.507993 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:38 crc kubenswrapper[5118]: I0223 06:59:38.517022 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:38 crc kubenswrapper[5118]: I0223 06:59:38.917838 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8495c6f9db-8tm7q" Feb 23 06:59:38 crc kubenswrapper[5118]: I0223 06:59:38.987569 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-r9t74"] Feb 23 06:59:43 crc kubenswrapper[5118]: I0223 06:59:43.180758 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kfjtf" Feb 23 06:59:48 crc kubenswrapper[5118]: I0223 06:59:48.727300 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-vvxz2" Feb 23 06:59:51 crc kubenswrapper[5118]: E0223 06:59:51.052629 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-tmpfiles-clean.service\": RecentStats: unable to find data in memory cache]" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.181501 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl"] Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.183433 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.189855 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.190318 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.190840 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.190891 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvhsk\" (UniqueName: \"kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.191120 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.192590 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl"] Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.292761 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.292924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.292967 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvhsk\" (UniqueName: \"kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc 
kubenswrapper[5118]: I0223 07:00:00.294543 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.308967 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.320637 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvhsk\" (UniqueName: \"kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk\") pod \"collect-profiles-29530500-sqsbl\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.518736 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:00 crc kubenswrapper[5118]: I0223 07:00:00.768719 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl"] Feb 23 07:00:01 crc kubenswrapper[5118]: I0223 07:00:01.102145 5118 generic.go:334] "Generic (PLEG): container finished" podID="64221749-8f82-47ac-8b0f-cfef43a17733" containerID="daa505d71a81ac09727ecefc26fb5bbbe89811b2651cf6ab2719021aa0fa6d91" exitCode=0 Feb 23 07:00:01 crc kubenswrapper[5118]: I0223 07:00:01.102291 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" event={"ID":"64221749-8f82-47ac-8b0f-cfef43a17733","Type":"ContainerDied","Data":"daa505d71a81ac09727ecefc26fb5bbbe89811b2651cf6ab2719021aa0fa6d91"} Feb 23 07:00:01 crc kubenswrapper[5118]: I0223 07:00:01.103326 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" event={"ID":"64221749-8f82-47ac-8b0f-cfef43a17733","Type":"ContainerStarted","Data":"b339b7ca3ff2b029b475c3892ff5547087aaaa1a4667fa174581220a5a06cefa"} Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.371684 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.524502 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvhsk\" (UniqueName: \"kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk\") pod \"64221749-8f82-47ac-8b0f-cfef43a17733\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.525145 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume\") pod \"64221749-8f82-47ac-8b0f-cfef43a17733\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.525222 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume\") pod \"64221749-8f82-47ac-8b0f-cfef43a17733\" (UID: \"64221749-8f82-47ac-8b0f-cfef43a17733\") " Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.527205 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume" (OuterVolumeSpecName: "config-volume") pod "64221749-8f82-47ac-8b0f-cfef43a17733" (UID: "64221749-8f82-47ac-8b0f-cfef43a17733"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.540383 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "64221749-8f82-47ac-8b0f-cfef43a17733" (UID: "64221749-8f82-47ac-8b0f-cfef43a17733"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.540497 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk" (OuterVolumeSpecName: "kube-api-access-cvhsk") pod "64221749-8f82-47ac-8b0f-cfef43a17733" (UID: "64221749-8f82-47ac-8b0f-cfef43a17733"). InnerVolumeSpecName "kube-api-access-cvhsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.628451 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/64221749-8f82-47ac-8b0f-cfef43a17733-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.628499 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/64221749-8f82-47ac-8b0f-cfef43a17733-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.628510 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvhsk\" (UniqueName: \"kubernetes.io/projected/64221749-8f82-47ac-8b0f-cfef43a17733-kube-api-access-cvhsk\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.975706 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:00:02 crc kubenswrapper[5118]: I0223 07:00:02.976284 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:00:03 crc kubenswrapper[5118]: I0223 07:00:03.124827 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" event={"ID":"64221749-8f82-47ac-8b0f-cfef43a17733","Type":"ContainerDied","Data":"b339b7ca3ff2b029b475c3892ff5547087aaaa1a4667fa174581220a5a06cefa"} Feb 23 07:00:03 crc kubenswrapper[5118]: I0223 07:00:03.124884 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b339b7ca3ff2b029b475c3892ff5547087aaaa1a4667fa174581220a5a06cefa" Feb 23 07:00:03 crc kubenswrapper[5118]: I0223 07:00:03.124950 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.046421 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-r9t74" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" containerID="cri-o://3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16" gracePeriod=15 Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.493563 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-r9t74_d8fc5dad-079f-4768-a10e-616ff7228ccd/console/0.log" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.495381 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562634 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562708 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562754 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562797 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562823 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562844 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.562861 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xpcs\" (UniqueName: \"kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs\") pod \"d8fc5dad-079f-4768-a10e-616ff7228ccd\" (UID: \"d8fc5dad-079f-4768-a10e-616ff7228ccd\") " Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.565093 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca" (OuterVolumeSpecName: "service-ca") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.565259 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config" (OuterVolumeSpecName: "console-config") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.565782 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.566067 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.571186 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.571565 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.571749 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs" (OuterVolumeSpecName: "kube-api-access-5xpcs") pod "d8fc5dad-079f-4768-a10e-616ff7228ccd" (UID: "d8fc5dad-079f-4768-a10e-616ff7228ccd"). InnerVolumeSpecName "kube-api-access-5xpcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665020 5118 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665061 5118 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665070 5118 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665079 5118 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665106 5118 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8fc5dad-079f-4768-a10e-616ff7228ccd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665116 5118 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8fc5dad-079f-4768-a10e-616ff7228ccd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:04 crc kubenswrapper[5118]: I0223 07:00:04.665124 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xpcs\" (UniqueName: \"kubernetes.io/projected/d8fc5dad-079f-4768-a10e-616ff7228ccd-kube-api-access-5xpcs\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:05 crc 
kubenswrapper[5118]: I0223 07:00:05.144489 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-r9t74_d8fc5dad-079f-4768-a10e-616ff7228ccd/console/0.log" Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.144579 5118 generic.go:334] "Generic (PLEG): container finished" podID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerID="3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16" exitCode=2 Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.144632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r9t74" event={"ID":"d8fc5dad-079f-4768-a10e-616ff7228ccd","Type":"ContainerDied","Data":"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16"} Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.144676 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r9t74" event={"ID":"d8fc5dad-079f-4768-a10e-616ff7228ccd","Type":"ContainerDied","Data":"1ce2ccfffb303f7b4c7a8e17ff25ef727daf4707a8d3437b778c1b3065776c76"} Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.144709 5118 scope.go:117] "RemoveContainer" containerID="3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16" Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.144770 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-r9t74" Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.187088 5118 scope.go:117] "RemoveContainer" containerID="3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16" Feb 23 07:00:05 crc kubenswrapper[5118]: E0223 07:00:05.187689 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16\": container with ID starting with 3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16 not found: ID does not exist" containerID="3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16" Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.187744 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16"} err="failed to get container status \"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16\": rpc error: code = NotFound desc = could not find container \"3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16\": container with ID starting with 3c480c58f84eb3101b9e98a526461e609861897364206aa48865d293b9730e16 not found: ID does not exist" Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.204759 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-r9t74"] Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.216261 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-r9t74"] Feb 23 07:00:05 crc kubenswrapper[5118]: I0223 07:00:05.706637 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" path="/var/lib/kubelet/pods/d8fc5dad-079f-4768-a10e-616ff7228ccd/volumes" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.371417 5118 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4"] Feb 23 07:00:06 crc kubenswrapper[5118]: E0223 07:00:06.371762 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64221749-8f82-47ac-8b0f-cfef43a17733" containerName="collect-profiles" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.371788 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="64221749-8f82-47ac-8b0f-cfef43a17733" containerName="collect-profiles" Feb 23 07:00:06 crc kubenswrapper[5118]: E0223 07:00:06.371811 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.371823 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.372006 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fc5dad-079f-4768-a10e-616ff7228ccd" containerName="console" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.372034 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="64221749-8f82-47ac-8b0f-cfef43a17733" containerName="collect-profiles" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.373416 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.375811 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.388528 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4"] Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.394459 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkfc\" (UniqueName: \"kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.394801 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.395015 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: 
I0223 07:00:06.497247 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.497497 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkfc\" (UniqueName: \"kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.497555 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.498532 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.498881 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.519201 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkfc\" (UniqueName: \"kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:06 crc kubenswrapper[5118]: I0223 07:00:06.721489 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:07 crc kubenswrapper[5118]: I0223 07:00:07.190812 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4"] Feb 23 07:00:07 crc kubenswrapper[5118]: W0223 07:00:07.199542 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2961e2e0_0d3d_4ab9_8916_c03b6ef25b22.slice/crio-2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404 WatchSource:0}: Error finding container 2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404: Status 404 returned error can't find the container with id 2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404 Feb 23 07:00:08 crc kubenswrapper[5118]: I0223 07:00:08.175703 5118 generic.go:334] "Generic (PLEG): container finished" podID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerID="fb3da9fd9357a19943ab90f2b4765af0976e1e4ecc9375b00b1cd524dbc19b81" 
exitCode=0 Feb 23 07:00:08 crc kubenswrapper[5118]: I0223 07:00:08.175884 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" event={"ID":"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22","Type":"ContainerDied","Data":"fb3da9fd9357a19943ab90f2b4765af0976e1e4ecc9375b00b1cd524dbc19b81"} Feb 23 07:00:08 crc kubenswrapper[5118]: I0223 07:00:08.176426 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" event={"ID":"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22","Type":"ContainerStarted","Data":"2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404"} Feb 23 07:00:10 crc kubenswrapper[5118]: I0223 07:00:10.195600 5118 generic.go:334] "Generic (PLEG): container finished" podID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerID="9f0d21b945736916f884a762d210f3019f7e3f645d12d08483c6845493b26a13" exitCode=0 Feb 23 07:00:10 crc kubenswrapper[5118]: I0223 07:00:10.195712 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" event={"ID":"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22","Type":"ContainerDied","Data":"9f0d21b945736916f884a762d210f3019f7e3f645d12d08483c6845493b26a13"} Feb 23 07:00:11 crc kubenswrapper[5118]: I0223 07:00:11.209971 5118 generic.go:334] "Generic (PLEG): container finished" podID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerID="9158b84176c611218f6ffaa741ea337d7aa9120af81ced20233f7c0a34a20bc4" exitCode=0 Feb 23 07:00:11 crc kubenswrapper[5118]: I0223 07:00:11.210280 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" event={"ID":"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22","Type":"ContainerDied","Data":"9158b84176c611218f6ffaa741ea337d7aa9120af81ced20233f7c0a34a20bc4"} Feb 23 07:00:12 crc 
kubenswrapper[5118]: I0223 07:00:12.604731 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.703772 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util\") pod \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.703890 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle\") pod \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.704085 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkfc\" (UniqueName: \"kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc\") pod \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\" (UID: \"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22\") " Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.706398 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle" (OuterVolumeSpecName: "bundle") pod "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" (UID: "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.715445 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc" (OuterVolumeSpecName: "kube-api-access-qnkfc") pod "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" (UID: "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22"). InnerVolumeSpecName "kube-api-access-qnkfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.731977 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util" (OuterVolumeSpecName: "util") pod "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" (UID: "2961e2e0-0d3d-4ab9-8916-c03b6ef25b22"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.806064 5118 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.806698 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkfc\" (UniqueName: \"kubernetes.io/projected/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-kube-api-access-qnkfc\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:12 crc kubenswrapper[5118]: I0223 07:00:12.806732 5118 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2961e2e0-0d3d-4ab9-8916-c03b6ef25b22-util\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:13 crc kubenswrapper[5118]: I0223 07:00:13.234127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" 
event={"ID":"2961e2e0-0d3d-4ab9-8916-c03b6ef25b22","Type":"ContainerDied","Data":"2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404"} Feb 23 07:00:13 crc kubenswrapper[5118]: I0223 07:00:13.234184 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a44125772dc03f4cbc7884499210165c2b435f0932ccc93fb7c0c5c95d84404" Feb 23 07:00:13 crc kubenswrapper[5118]: I0223 07:00:13.234272 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.964770 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq"] Feb 23 07:00:21 crc kubenswrapper[5118]: E0223 07:00:21.966449 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="pull" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.966467 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="pull" Feb 23 07:00:21 crc kubenswrapper[5118]: E0223 07:00:21.966482 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="extract" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.966490 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="extract" Feb 23 07:00:21 crc kubenswrapper[5118]: E0223 07:00:21.966508 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="util" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.966517 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="util" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.966668 5118 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2961e2e0-0d3d-4ab9-8916-c03b6ef25b22" containerName="extract" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.967210 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.976370 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.976746 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.977023 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-djb8c" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.977198 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.977320 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 23 07:00:21 crc kubenswrapper[5118]: I0223 07:00:21.980516 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq"] Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.141969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-webhook-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.142031 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-apiservice-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.142277 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtmm6\" (UniqueName: \"kubernetes.io/projected/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-kube-api-access-jtmm6\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.207425 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz"] Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.208212 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.210027 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.210606 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.211237 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qxdpj" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.228304 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz"] Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.244734 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-webhook-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.244811 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-apiservice-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.244853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtmm6\" (UniqueName: \"kubernetes.io/projected/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-kube-api-access-jtmm6\") pod 
\"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.263176 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-apiservice-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.263197 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-webhook-cert\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.270719 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtmm6\" (UniqueName: \"kubernetes.io/projected/1549afd4-1ff5-4412-92f1-10e57c8f3bc8-kube-api-access-jtmm6\") pod \"metallb-operator-controller-manager-566586f54b-bm2tq\" (UID: \"1549afd4-1ff5-4412-92f1-10e57c8f3bc8\") " pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.293904 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.345883 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-webhook-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.345954 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkwd\" (UniqueName: \"kubernetes.io/projected/721f2089-bb03-426d-adfa-f16793595526-kube-api-access-wxkwd\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.345997 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-apiservice-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.451704 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkwd\" (UniqueName: \"kubernetes.io/projected/721f2089-bb03-426d-adfa-f16793595526-kube-api-access-wxkwd\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.452367 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-apiservice-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.452559 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-webhook-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.459387 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-apiservice-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.477310 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/721f2089-bb03-426d-adfa-f16793595526-webhook-cert\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: \"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.479685 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkwd\" (UniqueName: \"kubernetes.io/projected/721f2089-bb03-426d-adfa-f16793595526-kube-api-access-wxkwd\") pod \"metallb-operator-webhook-server-5d5d656c8d-m4blz\" (UID: 
\"721f2089-bb03-426d-adfa-f16793595526\") " pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.522940 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.562027 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq"] Feb 23 07:00:22 crc kubenswrapper[5118]: I0223 07:00:22.877212 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz"] Feb 23 07:00:22 crc kubenswrapper[5118]: W0223 07:00:22.885800 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod721f2089_bb03_426d_adfa_f16793595526.slice/crio-8f9b1edcf254d7ca8ccee66fe84f6e9019428c3131eb3a0e3b2a5eb10e462bd9 WatchSource:0}: Error finding container 8f9b1edcf254d7ca8ccee66fe84f6e9019428c3131eb3a0e3b2a5eb10e462bd9: Status 404 returned error can't find the container with id 8f9b1edcf254d7ca8ccee66fe84f6e9019428c3131eb3a0e3b2a5eb10e462bd9 Feb 23 07:00:23 crc kubenswrapper[5118]: I0223 07:00:23.300568 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" event={"ID":"1549afd4-1ff5-4412-92f1-10e57c8f3bc8","Type":"ContainerStarted","Data":"5e8d6e87586debe7628a2eba7fddd54847665454ce1536f4561edeabe7862ca0"} Feb 23 07:00:23 crc kubenswrapper[5118]: I0223 07:00:23.301501 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" event={"ID":"721f2089-bb03-426d-adfa-f16793595526","Type":"ContainerStarted","Data":"8f9b1edcf254d7ca8ccee66fe84f6e9019428c3131eb3a0e3b2a5eb10e462bd9"} Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.342538 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" event={"ID":"721f2089-bb03-426d-adfa-f16793595526","Type":"ContainerStarted","Data":"3e5f80e8a3c2cd42efec2f13693120a989f7570ebf44d4d2e94bafd6cf02af6c"} Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.343137 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.346439 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" event={"ID":"1549afd4-1ff5-4412-92f1-10e57c8f3bc8","Type":"ContainerStarted","Data":"0970901a8873a335f10f614993f34c27824b70bfeae7b562598ae24934f34e6a"} Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.346648 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.367622 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" podStartSLOduration=1.450619369 podStartE2EDuration="6.367599634s" podCreationTimestamp="2026-02-23 07:00:22 +0000 UTC" firstStartedPulling="2026-02-23 07:00:22.889170216 +0000 UTC m=+885.892954779" lastFinishedPulling="2026-02-23 07:00:27.806150481 +0000 UTC m=+890.809935044" observedRunningTime="2026-02-23 07:00:28.366594779 +0000 UTC m=+891.370379352" watchObservedRunningTime="2026-02-23 07:00:28.367599634 +0000 UTC m=+891.371384207" Feb 23 07:00:28 crc kubenswrapper[5118]: I0223 07:00:28.402425 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" podStartSLOduration=2.194670418 podStartE2EDuration="7.40239825s" podCreationTimestamp="2026-02-23 07:00:21 +0000 UTC" 
firstStartedPulling="2026-02-23 07:00:22.572200329 +0000 UTC m=+885.575984902" lastFinishedPulling="2026-02-23 07:00:27.779928161 +0000 UTC m=+890.783712734" observedRunningTime="2026-02-23 07:00:28.399598763 +0000 UTC m=+891.403383336" watchObservedRunningTime="2026-02-23 07:00:28.40239825 +0000 UTC m=+891.406182823" Feb 23 07:00:32 crc kubenswrapper[5118]: I0223 07:00:32.976058 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:00:32 crc kubenswrapper[5118]: I0223 07:00:32.976688 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:00:32 crc kubenswrapper[5118]: I0223 07:00:32.976754 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:00:32 crc kubenswrapper[5118]: I0223 07:00:32.977601 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:00:32 crc kubenswrapper[5118]: I0223 07:00:32.977714 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" 
containerID="cri-o://52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73" gracePeriod=600 Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.111243 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.113349 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.121504 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.124582 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twrc4\" (UniqueName: \"kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.124672 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.124721 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.226611 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-twrc4\" (UniqueName: \"kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.226724 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.226766 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.227464 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.228249 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.256718 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-twrc4\" (UniqueName: \"kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4\") pod \"community-operators-2kvcw\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.382593 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73" exitCode=0 Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.382653 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73"} Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.382696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9"} Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.382727 5118 scope.go:117] "RemoveContainer" containerID="d49cae129c2c4cc8ccc31e30be8a372290b506680a9107dd5091173906b5a117" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.436353 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:33 crc kubenswrapper[5118]: I0223 07:00:33.717741 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:33 crc kubenswrapper[5118]: W0223 07:00:33.726633 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc5859c_d217_42a8_9bb7_1d26cde701b2.slice/crio-3e58ef87cc992db890fddb7af8765f98ef6f248fc66a267be44b605a925627ac WatchSource:0}: Error finding container 3e58ef87cc992db890fddb7af8765f98ef6f248fc66a267be44b605a925627ac: Status 404 returned error can't find the container with id 3e58ef87cc992db890fddb7af8765f98ef6f248fc66a267be44b605a925627ac Feb 23 07:00:34 crc kubenswrapper[5118]: I0223 07:00:34.392664 5118 generic.go:334] "Generic (PLEG): container finished" podID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerID="6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc" exitCode=0 Feb 23 07:00:34 crc kubenswrapper[5118]: I0223 07:00:34.392711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerDied","Data":"6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc"} Feb 23 07:00:34 crc kubenswrapper[5118]: I0223 07:00:34.393195 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerStarted","Data":"3e58ef87cc992db890fddb7af8765f98ef6f248fc66a267be44b605a925627ac"} Feb 23 07:00:35 crc kubenswrapper[5118]: I0223 07:00:35.401055 5118 generic.go:334] "Generic (PLEG): container finished" podID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerID="22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45" exitCode=0 Feb 23 07:00:35 crc kubenswrapper[5118]: I0223 
07:00:35.401132 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerDied","Data":"22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45"} Feb 23 07:00:36 crc kubenswrapper[5118]: I0223 07:00:36.411279 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerStarted","Data":"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e"} Feb 23 07:00:36 crc kubenswrapper[5118]: I0223 07:00:36.436588 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2kvcw" podStartSLOduration=2.066219055 podStartE2EDuration="3.436562926s" podCreationTimestamp="2026-02-23 07:00:33 +0000 UTC" firstStartedPulling="2026-02-23 07:00:34.394776658 +0000 UTC m=+897.398561231" lastFinishedPulling="2026-02-23 07:00:35.765120529 +0000 UTC m=+898.768905102" observedRunningTime="2026-02-23 07:00:36.434442114 +0000 UTC m=+899.438226687" watchObservedRunningTime="2026-02-23 07:00:36.436562926 +0000 UTC m=+899.440347509" Feb 23 07:00:42 crc kubenswrapper[5118]: I0223 07:00:42.528876 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5d5d656c8d-m4blz" Feb 23 07:00:43 crc kubenswrapper[5118]: I0223 07:00:43.437221 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:43 crc kubenswrapper[5118]: I0223 07:00:43.438184 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:43 crc kubenswrapper[5118]: I0223 07:00:43.529305 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:44 crc kubenswrapper[5118]: I0223 07:00:44.517952 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:44 crc kubenswrapper[5118]: I0223 07:00:44.574756 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.483505 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2kvcw" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="registry-server" containerID="cri-o://2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e" gracePeriod=2 Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.877081 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.945524 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twrc4\" (UniqueName: \"kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4\") pod \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.945585 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content\") pod \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.945724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities\") pod 
\"9dc5859c-d217-42a8-9bb7-1d26cde701b2\" (UID: \"9dc5859c-d217-42a8-9bb7-1d26cde701b2\") " Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.951247 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities" (OuterVolumeSpecName: "utilities") pod "9dc5859c-d217-42a8-9bb7-1d26cde701b2" (UID: "9dc5859c-d217-42a8-9bb7-1d26cde701b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:46 crc kubenswrapper[5118]: I0223 07:00:46.953704 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4" (OuterVolumeSpecName: "kube-api-access-twrc4") pod "9dc5859c-d217-42a8-9bb7-1d26cde701b2" (UID: "9dc5859c-d217-42a8-9bb7-1d26cde701b2"). InnerVolumeSpecName "kube-api-access-twrc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.003070 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc5859c-d217-42a8-9bb7-1d26cde701b2" (UID: "9dc5859c-d217-42a8-9bb7-1d26cde701b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.047140 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.047197 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twrc4\" (UniqueName: \"kubernetes.io/projected/9dc5859c-d217-42a8-9bb7-1d26cde701b2-kube-api-access-twrc4\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.047207 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc5859c-d217-42a8-9bb7-1d26cde701b2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.491499 5118 generic.go:334] "Generic (PLEG): container finished" podID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerID="2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e" exitCode=0 Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.491547 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerDied","Data":"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e"} Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.491582 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2kvcw" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.491622 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2kvcw" event={"ID":"9dc5859c-d217-42a8-9bb7-1d26cde701b2","Type":"ContainerDied","Data":"3e58ef87cc992db890fddb7af8765f98ef6f248fc66a267be44b605a925627ac"} Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.491652 5118 scope.go:117] "RemoveContainer" containerID="2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.509441 5118 scope.go:117] "RemoveContainer" containerID="22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.522830 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.528489 5118 scope.go:117] "RemoveContainer" containerID="6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.543743 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2kvcw"] Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.566461 5118 scope.go:117] "RemoveContainer" containerID="2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e" Feb 23 07:00:47 crc kubenswrapper[5118]: E0223 07:00:47.566989 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e\": container with ID starting with 2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e not found: ID does not exist" containerID="2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.567044 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e"} err="failed to get container status \"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e\": rpc error: code = NotFound desc = could not find container \"2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e\": container with ID starting with 2085c5c3727b50a03def7b9ea9db0c90492f74d9be26e186e44715ab9681814e not found: ID does not exist" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.567077 5118 scope.go:117] "RemoveContainer" containerID="22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45" Feb 23 07:00:47 crc kubenswrapper[5118]: E0223 07:00:47.567714 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45\": container with ID starting with 22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45 not found: ID does not exist" containerID="22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.567773 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45"} err="failed to get container status \"22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45\": rpc error: code = NotFound desc = could not find container \"22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45\": container with ID starting with 22b7838f94199db9db8a76aae1e8c17b541d350daa0f75603d61edf2135eed45 not found: ID does not exist" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.567807 5118 scope.go:117] "RemoveContainer" containerID="6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc" Feb 23 07:00:47 crc kubenswrapper[5118]: E0223 
07:00:47.568362 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc\": container with ID starting with 6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc not found: ID does not exist" containerID="6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.568392 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc"} err="failed to get container status \"6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc\": rpc error: code = NotFound desc = could not find container \"6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc\": container with ID starting with 6c8869c035c0030d28d27dceed5b24d6e4b79548f18479c4bbc667fa83fc59cc not found: ID does not exist" Feb 23 07:00:47 crc kubenswrapper[5118]: I0223 07:00:47.705985 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" path="/var/lib/kubelet/pods/9dc5859c-d217-42a8-9bb7-1d26cde701b2/volumes" Feb 23 07:00:51 crc kubenswrapper[5118]: I0223 07:00:51.992902 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:00:52 crc kubenswrapper[5118]: E0223 07:00:52.000107 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="extract-utilities" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.000132 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="extract-utilities" Feb 23 07:00:52 crc kubenswrapper[5118]: E0223 07:00:52.000152 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="extract-content" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.000252 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="extract-content" Feb 23 07:00:52 crc kubenswrapper[5118]: E0223 07:00:52.000269 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="registry-server" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.000277 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="registry-server" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.005345 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc5859c-d217-42a8-9bb7-1d26cde701b2" containerName="registry-server" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.015171 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.025158 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.027975 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvsl\" (UniqueName: \"kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:52 crc kubenswrapper[5118]: I0223 07:00:52.028187 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.052240 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.129924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.129975 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvsl\" (UniqueName: \"kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.130006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.130638 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities\") pod \"certified-operators-zlwzx\" (UID: 
\"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.130686 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.151402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvsl\" (UniqueName: \"kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl\") pod \"certified-operators-zlwzx\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:52.361143 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:53.325174 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:53.536231 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerStarted","Data":"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698"} Feb 23 07:00:53 crc kubenswrapper[5118]: I0223 07:00:53.536308 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerStarted","Data":"776e37b6be77d81fc1d65245ae973c88d94600383317ff8d0a670a27b0d66dd9"} Feb 23 07:00:54 crc kubenswrapper[5118]: I0223 07:00:54.547272 5118 generic.go:334] "Generic (PLEG): container finished" podID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerID="7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698" exitCode=0 Feb 23 07:00:54 crc kubenswrapper[5118]: I0223 07:00:54.547380 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerDied","Data":"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698"} Feb 23 07:00:55 crc kubenswrapper[5118]: I0223 07:00:55.556394 5118 generic.go:334] "Generic (PLEG): container finished" podID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerID="e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363" exitCode=0 Feb 23 07:00:55 crc kubenswrapper[5118]: I0223 07:00:55.556455 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" 
event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerDied","Data":"e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363"} Feb 23 07:00:56 crc kubenswrapper[5118]: I0223 07:00:56.570973 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerStarted","Data":"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975"} Feb 23 07:00:56 crc kubenswrapper[5118]: I0223 07:00:56.601323 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zlwzx" podStartSLOduration=4.233948785 podStartE2EDuration="5.601297364s" podCreationTimestamp="2026-02-23 07:00:51 +0000 UTC" firstStartedPulling="2026-02-23 07:00:54.550282184 +0000 UTC m=+917.554066797" lastFinishedPulling="2026-02-23 07:00:55.917630793 +0000 UTC m=+918.921415376" observedRunningTime="2026-02-23 07:00:56.599219753 +0000 UTC m=+919.603004366" watchObservedRunningTime="2026-02-23 07:00:56.601297364 +0000 UTC m=+919.605081947" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.297482 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-566586f54b-bm2tq" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.362011 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.362069 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.412778 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.688807 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:02 crc kubenswrapper[5118]: I0223 07:01:02.756000 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.054698 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5vhwv"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.057412 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.058876 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.059792 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.062009 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8bdj6" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.062380 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.063023 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.065204 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.085534 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109216 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e891361-5f36-4ebe-a394-c553a139765a-frr-startup\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109269 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-metrics\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109337 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-conf\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109383 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109444 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-reloader\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109464 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8qx\" (UniqueName: 
\"kubernetes.io/projected/4e891361-5f36-4ebe-a394-c553a139765a-kube-api-access-gv8qx\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109522 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2qn\" (UniqueName: \"kubernetes.io/projected/cafc617b-b973-45be-8f33-c7a6414d303a-kube-api-access-jd2qn\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109573 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-sockets\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.109589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.174867 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-75bvd"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.177542 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.186600 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.186823 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.187015 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dhd5j" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.188432 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211042 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-sockets\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211087 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211136 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltj8v\" (UniqueName: \"kubernetes.io/projected/ad24076e-75f9-406f-8f31-11211974625d-kube-api-access-ltj8v\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211165 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e891361-5f36-4ebe-a394-c553a139765a-frr-startup\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211188 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-metrics\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211209 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ad24076e-75f9-406f-8f31-11211974625d-metallb-excludel2\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211242 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-conf\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211261 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211283 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-metrics-certs\") pod \"speaker-75bvd\" (UID: 
\"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211304 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211343 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-reloader\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211364 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8qx\" (UniqueName: \"kubernetes.io/projected/4e891361-5f36-4ebe-a394-c553a139765a-kube-api-access-gv8qx\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211389 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2qn\" (UniqueName: \"kubernetes.io/projected/cafc617b-b973-45be-8f33-c7a6414d303a-kube-api-access-jd2qn\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.211866 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-5fkd5"] Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.212060 5118 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 23 07:01:03 crc kubenswrapper[5118]: 
I0223 07:01:03.212536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-sockets\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.212855 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs podName:4e891361-5f36-4ebe-a394-c553a139765a nodeName:}" failed. No retries permitted until 2026-02-23 07:01:03.712104984 +0000 UTC m=+926.715889557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs") pod "frr-k8s-5vhwv" (UID: "4e891361-5f36-4ebe-a394-c553a139765a") : secret "frr-k8s-certs-secret" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.213023 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-frr-conf\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.213125 5118 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.213159 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-metrics\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.213170 5118 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert podName:cafc617b-b973-45be-8f33-c7a6414d303a nodeName:}" failed. No retries permitted until 2026-02-23 07:01:03.713152469 +0000 UTC m=+926.716937042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert") pod "frr-k8s-webhook-server-78b44bf5bb-gtnbz" (UID: "cafc617b-b973-45be-8f33-c7a6414d303a") : secret "frr-k8s-webhook-server-cert" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.213402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e891361-5f36-4ebe-a394-c553a139765a-reloader\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.230673 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e891361-5f36-4ebe-a394-c553a139765a-frr-startup\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.232225 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.233559 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.246311 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2qn\" (UniqueName: \"kubernetes.io/projected/cafc617b-b973-45be-8f33-c7a6414d303a-kube-api-access-jd2qn\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.246704 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5fkd5"] Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.250458 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8qx\" (UniqueName: \"kubernetes.io/projected/4e891361-5f36-4ebe-a394-c553a139765a-kube-api-access-gv8qx\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313081 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-cert\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ad24076e-75f9-406f-8f31-11211974625d-metallb-excludel2\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc 
kubenswrapper[5118]: I0223 07:01:03.313270 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-metrics-certs\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313301 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313340 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313387 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7m4\" (UniqueName: \"kubernetes.io/projected/6355cba8-47c0-4813-b2a6-14b7ba533f5d-kube-api-access-4q7m4\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.313490 5118 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.313592 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist podName:ad24076e-75f9-406f-8f31-11211974625d nodeName:}" failed. 
No retries permitted until 2026-02-23 07:01:03.813564003 +0000 UTC m=+926.817348576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist") pod "speaker-75bvd" (UID: "ad24076e-75f9-406f-8f31-11211974625d") : secret "metallb-memberlist" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.313621 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltj8v\" (UniqueName: \"kubernetes.io/projected/ad24076e-75f9-406f-8f31-11211974625d-kube-api-access-ltj8v\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.314330 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ad24076e-75f9-406f-8f31-11211974625d-metallb-excludel2\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.316655 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-metrics-certs\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.335814 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltj8v\" (UniqueName: \"kubernetes.io/projected/ad24076e-75f9-406f-8f31-11211974625d-kube-api-access-ltj8v\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.415331 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-cert\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.415752 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.415882 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q7m4\" (UniqueName: \"kubernetes.io/projected/6355cba8-47c0-4813-b2a6-14b7ba533f5d-kube-api-access-4q7m4\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.415878 5118 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.416141 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs podName:6355cba8-47c0-4813-b2a6-14b7ba533f5d nodeName:}" failed. No retries permitted until 2026-02-23 07:01:03.916112237 +0000 UTC m=+926.919896880 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs") pod "controller-69bbfbf88f-5fkd5" (UID: "6355cba8-47c0-4813-b2a6-14b7ba533f5d") : secret "controller-certs-secret" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.416982 5118 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.430410 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-cert\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.440887 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7m4\" (UniqueName: \"kubernetes.io/projected/6355cba8-47c0-4813-b2a6-14b7ba533f5d-kube-api-access-4q7m4\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.720446 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.720623 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 
07:01:03.725677 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e891361-5f36-4ebe-a394-c553a139765a-metrics-certs\") pod \"frr-k8s-5vhwv\" (UID: \"4e891361-5f36-4ebe-a394-c553a139765a\") " pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.730348 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cafc617b-b973-45be-8f33-c7a6414d303a-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-gtnbz\" (UID: \"cafc617b-b973-45be-8f33-c7a6414d303a\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.822895 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.823194 5118 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 07:01:03 crc kubenswrapper[5118]: E0223 07:01:03.823389 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist podName:ad24076e-75f9-406f-8f31-11211974625d nodeName:}" failed. No retries permitted until 2026-02-23 07:01:04.823342313 +0000 UTC m=+927.827126926 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist") pod "speaker-75bvd" (UID: "ad24076e-75f9-406f-8f31-11211974625d") : secret "metallb-memberlist" not found Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.924545 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.930068 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6355cba8-47c0-4813-b2a6-14b7ba533f5d-metrics-certs\") pod \"controller-69bbfbf88f-5fkd5\" (UID: \"6355cba8-47c0-4813-b2a6-14b7ba533f5d\") " pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.977826 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:03 crc kubenswrapper[5118]: I0223 07:01:03.984275 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.188941 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.507279 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz"] Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.619894 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5fkd5"] Feb 23 07:01:04 crc kubenswrapper[5118]: W0223 07:01:04.627536 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6355cba8_47c0_4813_b2a6_14b7ba533f5d.slice/crio-e69732b3eac3b5b18f91a078547f969af0f3126b6127e23bc3501e93ba893278 WatchSource:0}: Error finding container e69732b3eac3b5b18f91a078547f969af0f3126b6127e23bc3501e93ba893278: Status 404 returned error can't find the container with id e69732b3eac3b5b18f91a078547f969af0f3126b6127e23bc3501e93ba893278 Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.640806 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5fkd5" event={"ID":"6355cba8-47c0-4813-b2a6-14b7ba533f5d","Type":"ContainerStarted","Data":"e69732b3eac3b5b18f91a078547f969af0f3126b6127e23bc3501e93ba893278"} Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.642202 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"29eb77ad562769e9a2a4de2d0098534bb1ba2ca8515b54e8df8bdc0bebdb5efe"} Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.644631 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" event={"ID":"cafc617b-b973-45be-8f33-c7a6414d303a","Type":"ContainerStarted","Data":"d75c09a1031556773e7b920dd474842dea8dc20051c74de5645ade84e1797672"} Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.644886 5118 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zlwzx" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="registry-server" containerID="cri-o://1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975" gracePeriod=2 Feb 23 07:01:04 crc kubenswrapper[5118]: I0223 07:01:04.842946 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:04 crc kubenswrapper[5118]: E0223 07:01:04.843292 5118 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 23 07:01:04 crc kubenswrapper[5118]: E0223 07:01:04.843416 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist podName:ad24076e-75f9-406f-8f31-11211974625d nodeName:}" failed. No retries permitted until 2026-02-23 07:01:06.843390127 +0000 UTC m=+929.847174700 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist") pod "speaker-75bvd" (UID: "ad24076e-75f9-406f-8f31-11211974625d") : secret "metallb-memberlist" not found Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.023173 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.148787 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content\") pod \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.148934 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztvsl\" (UniqueName: \"kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl\") pod \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.148979 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities\") pod \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\" (UID: \"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb\") " Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.150606 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities" (OuterVolumeSpecName: "utilities") pod "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" (UID: "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.156039 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl" (OuterVolumeSpecName: "kube-api-access-ztvsl") pod "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" (UID: "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb"). InnerVolumeSpecName "kube-api-access-ztvsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.208340 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" (UID: "4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.251445 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztvsl\" (UniqueName: \"kubernetes.io/projected/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-kube-api-access-ztvsl\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.251757 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.251888 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.669940 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5fkd5" event={"ID":"6355cba8-47c0-4813-b2a6-14b7ba533f5d","Type":"ContainerStarted","Data":"07e2f7baee0922fe24a13136fae8a8056e5cc5d35eeacaa9150126df9b3bf16d"} Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.670013 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5fkd5" event={"ID":"6355cba8-47c0-4813-b2a6-14b7ba533f5d","Type":"ContainerStarted","Data":"9745bd18e8b0077fcd8a90868e2daee025580f6915b981c03cb958903b168b17"} Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 
07:01:05.670189 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.676914 5118 generic.go:334] "Generic (PLEG): container finished" podID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerID="1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975" exitCode=0 Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.677077 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zlwzx" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.677813 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerDied","Data":"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975"} Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.677989 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zlwzx" event={"ID":"4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb","Type":"ContainerDied","Data":"776e37b6be77d81fc1d65245ae973c88d94600383317ff8d0a670a27b0d66dd9"} Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.678306 5118 scope.go:117] "RemoveContainer" containerID="1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.691678 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-5fkd5" podStartSLOduration=2.691652302 podStartE2EDuration="2.691652302s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:05.688934957 +0000 UTC m=+928.692719550" watchObservedRunningTime="2026-02-23 07:01:05.691652302 +0000 UTC m=+928.695436865" Feb 23 07:01:05 
crc kubenswrapper[5118]: I0223 07:01:05.705353 5118 scope.go:117] "RemoveContainer" containerID="e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.718670 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.722805 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zlwzx"] Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.748046 5118 scope.go:117] "RemoveContainer" containerID="7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.762743 5118 scope.go:117] "RemoveContainer" containerID="1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975" Feb 23 07:01:05 crc kubenswrapper[5118]: E0223 07:01:05.763823 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975\": container with ID starting with 1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975 not found: ID does not exist" containerID="1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.763868 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975"} err="failed to get container status \"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975\": rpc error: code = NotFound desc = could not find container \"1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975\": container with ID starting with 1335ad12741b883c2cb553e4d22ecbc78b9376254e5448dd06a7be8418cc5975 not found: ID does not exist" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.763897 5118 
scope.go:117] "RemoveContainer" containerID="e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363" Feb 23 07:01:05 crc kubenswrapper[5118]: E0223 07:01:05.764662 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363\": container with ID starting with e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363 not found: ID does not exist" containerID="e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.764738 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363"} err="failed to get container status \"e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363\": rpc error: code = NotFound desc = could not find container \"e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363\": container with ID starting with e111c7896880067ae0b314c9bbbf025d5fd6110f1667726e76a5243071b51363 not found: ID does not exist" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.764773 5118 scope.go:117] "RemoveContainer" containerID="7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698" Feb 23 07:01:05 crc kubenswrapper[5118]: E0223 07:01:05.765367 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698\": container with ID starting with 7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698 not found: ID does not exist" containerID="7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698" Feb 23 07:01:05 crc kubenswrapper[5118]: I0223 07:01:05.765399 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698"} err="failed to get container status \"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698\": rpc error: code = NotFound desc = could not find container \"7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698\": container with ID starting with 7e2b28185931781b0b79d3e251958cf439ab326c0f4a42bda5484f2db7616698 not found: ID does not exist" Feb 23 07:01:06 crc kubenswrapper[5118]: I0223 07:01:06.876799 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:06 crc kubenswrapper[5118]: I0223 07:01:06.881023 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ad24076e-75f9-406f-8f31-11211974625d-memberlist\") pod \"speaker-75bvd\" (UID: \"ad24076e-75f9-406f-8f31-11211974625d\") " pod="metallb-system/speaker-75bvd" Feb 23 07:01:07 crc kubenswrapper[5118]: I0223 07:01:07.130377 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-75bvd" Feb 23 07:01:07 crc kubenswrapper[5118]: I0223 07:01:07.710193 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" path="/var/lib/kubelet/pods/4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb/volumes" Feb 23 07:01:07 crc kubenswrapper[5118]: I0223 07:01:07.711572 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-75bvd" event={"ID":"ad24076e-75f9-406f-8f31-11211974625d","Type":"ContainerStarted","Data":"2b9096033908aa0a7ff7e84ada0e855bdaf0b93aff6c9b2b003db8049478a948"} Feb 23 07:01:07 crc kubenswrapper[5118]: I0223 07:01:07.711635 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-75bvd" event={"ID":"ad24076e-75f9-406f-8f31-11211974625d","Type":"ContainerStarted","Data":"a4da4dcb8135ef5fd696b9b22f38be00fae27266b260be8f13ca788da3b01d91"} Feb 23 07:01:08 crc kubenswrapper[5118]: I0223 07:01:08.725558 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-75bvd" event={"ID":"ad24076e-75f9-406f-8f31-11211974625d","Type":"ContainerStarted","Data":"16377b50257c8337828b778095bed864fa0e1702bf077387566a73fcc450aafe"} Feb 23 07:01:08 crc kubenswrapper[5118]: I0223 07:01:08.751850 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-75bvd" podStartSLOduration=5.751820535 podStartE2EDuration="5.751820535s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:08.742636003 +0000 UTC m=+931.746420596" watchObservedRunningTime="2026-02-23 07:01:08.751820535 +0000 UTC m=+931.755605118" Feb 23 07:01:09 crc kubenswrapper[5118]: I0223 07:01:09.733756 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-75bvd" Feb 23 07:01:12 crc kubenswrapper[5118]: I0223 
07:01:12.756224 5118 generic.go:334] "Generic (PLEG): container finished" podID="4e891361-5f36-4ebe-a394-c553a139765a" containerID="e0354eadfb56e3d0e1c4d54fa2fc9d15b475ff8ece8918629e32739b835277b1" exitCode=0 Feb 23 07:01:12 crc kubenswrapper[5118]: I0223 07:01:12.756340 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerDied","Data":"e0354eadfb56e3d0e1c4d54fa2fc9d15b475ff8ece8918629e32739b835277b1"} Feb 23 07:01:12 crc kubenswrapper[5118]: I0223 07:01:12.758722 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" event={"ID":"cafc617b-b973-45be-8f33-c7a6414d303a","Type":"ContainerStarted","Data":"9935542d3c64cd010cdd0472a3cb02a6e0dde2e1e2bb976465fdeb5e6cdeee51"} Feb 23 07:01:12 crc kubenswrapper[5118]: I0223 07:01:12.758922 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" Feb 23 07:01:12 crc kubenswrapper[5118]: I0223 07:01:12.825668 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz" podStartSLOduration=2.553640625 podStartE2EDuration="9.825642746s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" firstStartedPulling="2026-02-23 07:01:04.515360344 +0000 UTC m=+927.519144937" lastFinishedPulling="2026-02-23 07:01:11.787362485 +0000 UTC m=+934.791147058" observedRunningTime="2026-02-23 07:01:12.821715962 +0000 UTC m=+935.825500535" watchObservedRunningTime="2026-02-23 07:01:12.825642746 +0000 UTC m=+935.829427339" Feb 23 07:01:13 crc kubenswrapper[5118]: I0223 07:01:13.769532 5118 generic.go:334] "Generic (PLEG): container finished" podID="4e891361-5f36-4ebe-a394-c553a139765a" containerID="efb7ac7fbaa10d8bd6b0063aa52d65a1563ed5a02643928fbd13e41c87ce2598" exitCode=0 Feb 23 07:01:13 crc kubenswrapper[5118]: I0223 07:01:13.770437 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerDied","Data":"efb7ac7fbaa10d8bd6b0063aa52d65a1563ed5a02643928fbd13e41c87ce2598"} Feb 23 07:01:14 crc kubenswrapper[5118]: I0223 07:01:14.196601 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-5fkd5" Feb 23 07:01:14 crc kubenswrapper[5118]: I0223 07:01:14.781702 5118 generic.go:334] "Generic (PLEG): container finished" podID="4e891361-5f36-4ebe-a394-c553a139765a" containerID="d0a837b09f1b2b41acb813b00f94cbd5c688061a514fde1e6320f80e9c340585" exitCode=0 Feb 23 07:01:14 crc kubenswrapper[5118]: I0223 07:01:14.781846 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerDied","Data":"d0a837b09f1b2b41acb813b00f94cbd5c688061a514fde1e6320f80e9c340585"} Feb 23 07:01:15 crc kubenswrapper[5118]: I0223 07:01:15.797413 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"c68f56fd008965a99a820df8d75fc1bd4811361d932ce513b22b0096e7dc4e51"} Feb 23 07:01:15 crc kubenswrapper[5118]: I0223 07:01:15.797470 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"f624442c49c964b1f5419be7803a43aae05938400323c71640e54d96b1f788f5"} Feb 23 07:01:15 crc kubenswrapper[5118]: I0223 07:01:15.797483 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"f5c7f30ece52251c6924b33a6a5d34f3f9a8dc1da69f2b5e54324b4b59259d38"} Feb 23 07:01:15 crc kubenswrapper[5118]: I0223 07:01:15.797495 5118 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"2dd9fcd5917337c29ccfa01dc8adc877fd88017b5ebdd388de27f453b671fce6"} Feb 23 07:01:15 crc kubenswrapper[5118]: I0223 07:01:15.797508 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"14dc73e73e6407b7cf6ba2bbe55aaea1d170e046dffea3c755a584548c2d3496"} Feb 23 07:01:16 crc kubenswrapper[5118]: I0223 07:01:16.814577 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5vhwv" event={"ID":"4e891361-5f36-4ebe-a394-c553a139765a","Type":"ContainerStarted","Data":"b45b70298b33f109137324ed2e5b587abbd839abcf49e0138fb6b5cbd1a44c3e"} Feb 23 07:01:16 crc kubenswrapper[5118]: I0223 07:01:16.815062 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:16 crc kubenswrapper[5118]: I0223 07:01:16.850326 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5vhwv" podStartSLOduration=6.23219733 podStartE2EDuration="13.850299217s" podCreationTimestamp="2026-02-23 07:01:03 +0000 UTC" firstStartedPulling="2026-02-23 07:01:04.178472508 +0000 UTC m=+927.182257091" lastFinishedPulling="2026-02-23 07:01:11.796574365 +0000 UTC m=+934.800358978" observedRunningTime="2026-02-23 07:01:16.849004106 +0000 UTC m=+939.852788769" watchObservedRunningTime="2026-02-23 07:01:16.850299217 +0000 UTC m=+939.854083800" Feb 23 07:01:17 crc kubenswrapper[5118]: I0223 07:01:17.136249 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-75bvd" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.464077 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78"] Feb 23 07:01:18 crc 
kubenswrapper[5118]: E0223 07:01:18.464396 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="extract-content" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.464409 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="extract-content" Feb 23 07:01:18 crc kubenswrapper[5118]: E0223 07:01:18.464427 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="registry-server" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.464433 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="registry-server" Feb 23 07:01:18 crc kubenswrapper[5118]: E0223 07:01:18.464442 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="extract-utilities" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.464449 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="extract-utilities" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.464558 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d15d135-be2b-42a9-a5f1-b89a3e0ae5cb" containerName="registry-server" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.465378 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.468606 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.522646 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78"] Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.603531 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.603601 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lhvh\" (UniqueName: \"kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.603634 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: 
I0223 07:01:18.705479 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.705552 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lhvh\" (UniqueName: \"kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.705582 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.706462 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.706509 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.729807 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lhvh\" (UniqueName: \"kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.783334 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" Feb 23 07:01:18 crc kubenswrapper[5118]: I0223 07:01:18.978053 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:19 crc kubenswrapper[5118]: I0223 07:01:19.031820 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5vhwv" Feb 23 07:01:19 crc kubenswrapper[5118]: I0223 07:01:19.234614 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78"] Feb 23 07:01:19 crc kubenswrapper[5118]: I0223 07:01:19.837580 5118 generic.go:334] "Generic (PLEG): container finished" podID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerID="e4786ac8bb8a042b6e867a9637eb7bb0ea58338413e129981296ab2905037c0b" exitCode=0 Feb 23 07:01:19 crc kubenswrapper[5118]: I0223 07:01:19.839600 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" event={"ID":"9fc8d426-e713-4b39-b393-717378bc1b1b","Type":"ContainerDied","Data":"e4786ac8bb8a042b6e867a9637eb7bb0ea58338413e129981296ab2905037c0b"}
Feb 23 07:01:19 crc kubenswrapper[5118]: I0223 07:01:19.839641 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" event={"ID":"9fc8d426-e713-4b39-b393-717378bc1b1b","Type":"ContainerStarted","Data":"2a3804ec5f40062d529ed8a2b388c197baf6b5ff3b16cf88fa9d4c4f5fa05398"}
Feb 23 07:01:23 crc kubenswrapper[5118]: I0223 07:01:23.990961 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-gtnbz"
Feb 23 07:01:24 crc kubenswrapper[5118]: I0223 07:01:24.883077 5118 generic.go:334] "Generic (PLEG): container finished" podID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerID="06400fc8e93f1e49f63de6bd7760dec8bf4352a7271a7f2e0afc5d125be1f9b9" exitCode=0
Feb 23 07:01:24 crc kubenswrapper[5118]: I0223 07:01:24.883174 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" event={"ID":"9fc8d426-e713-4b39-b393-717378bc1b1b","Type":"ContainerDied","Data":"06400fc8e93f1e49f63de6bd7760dec8bf4352a7271a7f2e0afc5d125be1f9b9"}
Feb 23 07:01:25 crc kubenswrapper[5118]: I0223 07:01:25.893696 5118 generic.go:334] "Generic (PLEG): container finished" podID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerID="cbe943fcdfbedb3060245c7b395b1429f375c3c7d25ed61c4f51d9b0350942ce" exitCode=0
Feb 23 07:01:25 crc kubenswrapper[5118]: I0223 07:01:25.893938 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" event={"ID":"9fc8d426-e713-4b39-b393-717378bc1b1b","Type":"ContainerDied","Data":"cbe943fcdfbedb3060245c7b395b1429f375c3c7d25ed61c4f51d9b0350942ce"}
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.271484 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78"
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.364315 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util\") pod \"9fc8d426-e713-4b39-b393-717378bc1b1b\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") "
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.364487 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle\") pod \"9fc8d426-e713-4b39-b393-717378bc1b1b\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") "
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.364550 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lhvh\" (UniqueName: \"kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh\") pod \"9fc8d426-e713-4b39-b393-717378bc1b1b\" (UID: \"9fc8d426-e713-4b39-b393-717378bc1b1b\") "
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.366923 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle" (OuterVolumeSpecName: "bundle") pod "9fc8d426-e713-4b39-b393-717378bc1b1b" (UID: "9fc8d426-e713-4b39-b393-717378bc1b1b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.375626 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh" (OuterVolumeSpecName: "kube-api-access-9lhvh") pod "9fc8d426-e713-4b39-b393-717378bc1b1b" (UID: "9fc8d426-e713-4b39-b393-717378bc1b1b"). InnerVolumeSpecName "kube-api-access-9lhvh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.382485 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util" (OuterVolumeSpecName: "util") pod "9fc8d426-e713-4b39-b393-717378bc1b1b" (UID: "9fc8d426-e713-4b39-b393-717378bc1b1b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.466965 5118 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-util\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.467019 5118 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fc8d426-e713-4b39-b393-717378bc1b1b-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.467041 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lhvh\" (UniqueName: \"kubernetes.io/projected/9fc8d426-e713-4b39-b393-717378bc1b1b-kube-api-access-9lhvh\") on node \"crc\" DevicePath \"\""
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.916016 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78" event={"ID":"9fc8d426-e713-4b39-b393-717378bc1b1b","Type":"ContainerDied","Data":"2a3804ec5f40062d529ed8a2b388c197baf6b5ff3b16cf88fa9d4c4f5fa05398"}
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.916085 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3804ec5f40062d529ed8a2b388c197baf6b5ff3b16cf88fa9d4c4f5fa05398"
Feb 23 07:01:27 crc kubenswrapper[5118]: I0223 07:01:27.916146 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.492924 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"]
Feb 23 07:01:31 crc kubenswrapper[5118]: E0223 07:01:31.494828 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="util"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.494972 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="util"
Feb 23 07:01:31 crc kubenswrapper[5118]: E0223 07:01:31.495123 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="extract"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.495249 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="extract"
Feb 23 07:01:31 crc kubenswrapper[5118]: E0223 07:01:31.495369 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="pull"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.495485 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="pull"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.495771 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc8d426-e713-4b39-b393-717378bc1b1b" containerName="extract"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.496714 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.502344 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.503262 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.503831 5118 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-t2sch"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.518306 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"]
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.632944 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.633375 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv9x\" (UniqueName: \"kubernetes.io/projected/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-kube-api-access-4cv9x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.734821 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.734939 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv9x\" (UniqueName: \"kubernetes.io/projected/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-kube-api-access-4cv9x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.735385 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.754770 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv9x\" (UniqueName: \"kubernetes.io/projected/f9e34fce-fcb2-4959-88fb-e06e40e3e5f5-kube-api-access-4cv9x\") pod \"cert-manager-operator-controller-manager-66c8bdd694-x8ftk\" (UID: \"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:31 crc kubenswrapper[5118]: I0223 07:01:31.819511 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"
Feb 23 07:01:32 crc kubenswrapper[5118]: I0223 07:01:32.348360 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk"]
Feb 23 07:01:32 crc kubenswrapper[5118]: W0223 07:01:32.360438 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9e34fce_fcb2_4959_88fb_e06e40e3e5f5.slice/crio-d40cacfa0670021d9dce76cbcd9a5e282c067191092704d040869ca09911c2be WatchSource:0}: Error finding container d40cacfa0670021d9dce76cbcd9a5e282c067191092704d040869ca09911c2be: Status 404 returned error can't find the container with id d40cacfa0670021d9dce76cbcd9a5e282c067191092704d040869ca09911c2be
Feb 23 07:01:32 crc kubenswrapper[5118]: I0223 07:01:32.948575 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk" event={"ID":"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5","Type":"ContainerStarted","Data":"d40cacfa0670021d9dce76cbcd9a5e282c067191092704d040869ca09911c2be"}
Feb 23 07:01:33 crc kubenswrapper[5118]: I0223 07:01:33.982304 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5vhwv"
Feb 23 07:01:35 crc kubenswrapper[5118]: I0223 07:01:35.972493 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk" event={"ID":"f9e34fce-fcb2-4959-88fb-e06e40e3e5f5","Type":"ContainerStarted","Data":"053247e821f49ebb6219ba9196261e2c92d827d9cbe09975b1908eed77b97c90"}
Feb 23 07:01:36 crc kubenswrapper[5118]: I0223 07:01:36.015930 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-x8ftk" podStartSLOduration=2.328288454 podStartE2EDuration="5.015903733s" podCreationTimestamp="2026-02-23 07:01:31 +0000 UTC" firstStartedPulling="2026-02-23 07:01:32.364493953 +0000 UTC m=+955.368278536" lastFinishedPulling="2026-02-23 07:01:35.052109252 +0000 UTC m=+958.055893815" observedRunningTime="2026-02-23 07:01:36.011868466 +0000 UTC m=+959.015653049" watchObservedRunningTime="2026-02-23 07:01:36.015903733 +0000 UTC m=+959.019688316"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.530753 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rkjcd"]
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.532439 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.539692 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.540126 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.540983 5118 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jn5s4"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.545081 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rkjcd"]
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.659269 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.659446 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8g9w\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-kube-api-access-r8g9w\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.761915 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.762354 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8g9w\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-kube-api-access-r8g9w\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.785401 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8g9w\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-kube-api-access-r8g9w\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.791061 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-rkjcd\" (UID: \"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:39 crc kubenswrapper[5118]: I0223 07:01:39.856039 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:40 crc kubenswrapper[5118]: I0223 07:01:40.329515 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-rkjcd"]
Feb 23 07:01:41 crc kubenswrapper[5118]: I0223 07:01:41.014060 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd" event={"ID":"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba","Type":"ContainerStarted","Data":"fc20515a7aa4474f2d74c90521ddbae03caad7b7e7e9c5696cdf14423fee6906"}
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.256622 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzjnz"]
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.257915 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.261750 5118 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mbrsb"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.266568 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzjnz"]
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.310233 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jv6\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-kube-api-access-q6jv6\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.310359 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.411503 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.411586 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jv6\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-kube-api-access-q6jv6\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.435253 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jv6\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-kube-api-access-q6jv6\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.436980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/523774d0-ef24-403e-8a1a-1fad85d7985c-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bzjnz\" (UID: \"523774d0-ef24-403e-8a1a-1fad85d7985c\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:42 crc kubenswrapper[5118]: I0223 07:01:42.581977 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz"
Feb 23 07:01:43 crc kubenswrapper[5118]: W0223 07:01:43.069020 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod523774d0_ef24_403e_8a1a_1fad85d7985c.slice/crio-522e0d52a558870c8baad968955c17d3e2fc199bf248ece7ad682f389167e15a WatchSource:0}: Error finding container 522e0d52a558870c8baad968955c17d3e2fc199bf248ece7ad682f389167e15a: Status 404 returned error can't find the container with id 522e0d52a558870c8baad968955c17d3e2fc199bf248ece7ad682f389167e15a
Feb 23 07:01:43 crc kubenswrapper[5118]: I0223 07:01:43.071613 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bzjnz"]
Feb 23 07:01:44 crc kubenswrapper[5118]: I0223 07:01:44.036482 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz" event={"ID":"523774d0-ef24-403e-8a1a-1fad85d7985c","Type":"ContainerStarted","Data":"522e0d52a558870c8baad968955c17d3e2fc199bf248ece7ad682f389167e15a"}
Feb 23 07:01:46 crc kubenswrapper[5118]: I0223 07:01:46.052772 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz" event={"ID":"523774d0-ef24-403e-8a1a-1fad85d7985c","Type":"ContainerStarted","Data":"5ca8f51c52ec5c5bd490c5f952e1107442d384128959c837ff4e48659d5578ba"}
Feb 23 07:01:46 crc kubenswrapper[5118]: I0223 07:01:46.065932 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd" event={"ID":"ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba","Type":"ContainerStarted","Data":"1d50169899359ed7c75239fbc2eeb5b6a0d4a791065feec2811b08e4b7b455e3"}
Feb 23 07:01:46 crc kubenswrapper[5118]: I0223 07:01:46.066198 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:46 crc kubenswrapper[5118]: I0223 07:01:46.100850 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-bzjnz" podStartSLOduration=2.020231853 podStartE2EDuration="4.100828074s" podCreationTimestamp="2026-02-23 07:01:42 +0000 UTC" firstStartedPulling="2026-02-23 07:01:43.072017436 +0000 UTC m=+966.075802009" lastFinishedPulling="2026-02-23 07:01:45.152613657 +0000 UTC m=+968.156398230" observedRunningTime="2026-02-23 07:01:46.086443399 +0000 UTC m=+969.090227982" watchObservedRunningTime="2026-02-23 07:01:46.100828074 +0000 UTC m=+969.104612647"
Feb 23 07:01:46 crc kubenswrapper[5118]: I0223 07:01:46.113433 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd" podStartSLOduration=2.298042014 podStartE2EDuration="7.113412167s" podCreationTimestamp="2026-02-23 07:01:39 +0000 UTC" firstStartedPulling="2026-02-23 07:01:40.343978596 +0000 UTC m=+963.347763179" lastFinishedPulling="2026-02-23 07:01:45.159348759 +0000 UTC m=+968.163133332" observedRunningTime="2026-02-23 07:01:46.112077365 +0000 UTC m=+969.115861948" watchObservedRunningTime="2026-02-23 07:01:46.113412167 +0000 UTC m=+969.117196740"
Feb 23 07:01:54 crc kubenswrapper[5118]: I0223 07:01:54.861672 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-rkjcd"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.513142 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-x2phk"]
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.514412 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.518242 5118 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-4c26g"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.538004 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x2phk"]
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.553645 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-bound-sa-token\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.553684 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssmnp\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-kube-api-access-ssmnp\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.656350 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-bound-sa-token\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.656439 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssmnp\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-kube-api-access-ssmnp\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.692823 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-bound-sa-token\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.695497 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssmnp\" (UniqueName: \"kubernetes.io/projected/8c7ed690-42a7-491b-ac42-b0e4afda4272-kube-api-access-ssmnp\") pod \"cert-manager-545d4d4674-x2phk\" (UID: \"8c7ed690-42a7-491b-ac42-b0e4afda4272\") " pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:57 crc kubenswrapper[5118]: I0223 07:01:57.838326 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-x2phk"
Feb 23 07:01:58 crc kubenswrapper[5118]: I0223 07:01:58.104999 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-x2phk"]
Feb 23 07:01:58 crc kubenswrapper[5118]: I0223 07:01:58.168045 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x2phk" event={"ID":"8c7ed690-42a7-491b-ac42-b0e4afda4272","Type":"ContainerStarted","Data":"a0234b79b03a93017b1e1c015e6a261abffa56c1498b5818901e29eb250b76d4"}
Feb 23 07:01:59 crc kubenswrapper[5118]: I0223 07:01:59.177292 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-x2phk" event={"ID":"8c7ed690-42a7-491b-ac42-b0e4afda4272","Type":"ContainerStarted","Data":"efc1cee95eed2081ff9790c0defbd00b91bef1c8c2b98f4750a8e9e1c82ca834"}
Feb 23 07:01:59 crc kubenswrapper[5118]: I0223 07:01:59.194224 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-x2phk" podStartSLOduration=2.194165973 podStartE2EDuration="2.194165973s" podCreationTimestamp="2026-02-23 07:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:01:59.193358404 +0000 UTC m=+982.197142997" watchObservedRunningTime="2026-02-23 07:01:59.194165973 +0000 UTC m=+982.197950566"
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.975188 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"]
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.977204 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.981516 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8jtjw"
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.987785 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.988158 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Feb 23 07:02:07 crc kubenswrapper[5118]: I0223 07:02:07.989049 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"]
Feb 23 07:02:08 crc kubenswrapper[5118]: I0223 07:02:08.028506 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv5b\" (UniqueName: \"kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b\") pod \"openstack-operator-index-qrxc8\" (UID: \"d950224a-6bde-4dae-b8db-b76092cf5afc\") " pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:08 crc kubenswrapper[5118]: I0223 07:02:08.131946 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv5b\" (UniqueName: \"kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b\") pod \"openstack-operator-index-qrxc8\" (UID: \"d950224a-6bde-4dae-b8db-b76092cf5afc\") " pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:08 crc kubenswrapper[5118]: I0223 07:02:08.152448 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv5b\" (UniqueName: \"kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b\") pod \"openstack-operator-index-qrxc8\" (UID: \"d950224a-6bde-4dae-b8db-b76092cf5afc\") " pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:08 crc kubenswrapper[5118]: I0223 07:02:08.308801 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:08 crc kubenswrapper[5118]: I0223 07:02:08.528420 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"]
Feb 23 07:02:09 crc kubenswrapper[5118]: I0223 07:02:09.275696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qrxc8" event={"ID":"d950224a-6bde-4dae-b8db-b76092cf5afc","Type":"ContainerStarted","Data":"940326f4afe8a87347320183705f6f57f320075e82c775d299415acf9272560a"}
Feb 23 07:02:10 crc kubenswrapper[5118]: I0223 07:02:10.286773 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qrxc8" event={"ID":"d950224a-6bde-4dae-b8db-b76092cf5afc","Type":"ContainerStarted","Data":"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d"}
Feb 23 07:02:10 crc kubenswrapper[5118]: I0223 07:02:10.315664 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qrxc8" podStartSLOduration=2.3263777660000002 podStartE2EDuration="3.31563556s" podCreationTimestamp="2026-02-23 07:02:07 +0000 UTC" firstStartedPulling="2026-02-23 07:02:08.581769012 +0000 UTC m=+991.585553595" lastFinishedPulling="2026-02-23 07:02:09.571026786 +0000 UTC m=+992.574811389" observedRunningTime="2026-02-23 07:02:10.315425174 +0000 UTC m=+993.319209747" watchObservedRunningTime="2026-02-23 07:02:10.31563556 +0000 UTC m=+993.319420133"
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.336078 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"]
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.744974 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qswx2"]
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.746375 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qswx2"
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.757818 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qswx2"]
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.798924 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m74pl\" (UniqueName: \"kubernetes.io/projected/8a10def1-5478-48e8-acb6-3cc2e549e94a-kube-api-access-m74pl\") pod \"openstack-operator-index-qswx2\" (UID: \"8a10def1-5478-48e8-acb6-3cc2e549e94a\") " pod="openstack-operators/openstack-operator-index-qswx2"
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.900767 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m74pl\" (UniqueName: \"kubernetes.io/projected/8a10def1-5478-48e8-acb6-3cc2e549e94a-kube-api-access-m74pl\") pod \"openstack-operator-index-qswx2\" (UID: \"8a10def1-5478-48e8-acb6-3cc2e549e94a\") " pod="openstack-operators/openstack-operator-index-qswx2"
Feb 23 07:02:11 crc kubenswrapper[5118]: I0223 07:02:11.933451 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m74pl\" (UniqueName: \"kubernetes.io/projected/8a10def1-5478-48e8-acb6-3cc2e549e94a-kube-api-access-m74pl\") pod \"openstack-operator-index-qswx2\" (UID: \"8a10def1-5478-48e8-acb6-3cc2e549e94a\") " pod="openstack-operators/openstack-operator-index-qswx2"
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.084787 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qswx2"
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.324756 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qrxc8" podUID="d950224a-6bde-4dae-b8db-b76092cf5afc" containerName="registry-server" containerID="cri-o://32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d" gracePeriod=2
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.614238 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qswx2"]
Feb 23 07:02:12 crc kubenswrapper[5118]: W0223 07:02:12.616519 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a10def1_5478_48e8_acb6_3cc2e549e94a.slice/crio-391ad03063ba680a92ac4aecf60faa35a2157bdb7ae2e94a8fcd2e877c636109 WatchSource:0}: Error finding container 391ad03063ba680a92ac4aecf60faa35a2157bdb7ae2e94a8fcd2e877c636109: Status 404 returned error can't find the container with id 391ad03063ba680a92ac4aecf60faa35a2157bdb7ae2e94a8fcd2e877c636109
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.677209 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.714551 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv5b\" (UniqueName: \"kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b\") pod \"d950224a-6bde-4dae-b8db-b76092cf5afc\" (UID: \"d950224a-6bde-4dae-b8db-b76092cf5afc\") "
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.754499 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b" (OuterVolumeSpecName: "kube-api-access-znv5b") pod "d950224a-6bde-4dae-b8db-b76092cf5afc" (UID: "d950224a-6bde-4dae-b8db-b76092cf5afc"). InnerVolumeSpecName "kube-api-access-znv5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:02:12 crc kubenswrapper[5118]: I0223 07:02:12.817969 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv5b\" (UniqueName: \"kubernetes.io/projected/d950224a-6bde-4dae-b8db-b76092cf5afc-kube-api-access-znv5b\") on node \"crc\" DevicePath \"\""
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.335569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qswx2" event={"ID":"8a10def1-5478-48e8-acb6-3cc2e549e94a","Type":"ContainerStarted","Data":"2432fce03c711454d74a085738fcca996acb4084c2b377607738199c9ded8fb4"}
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.335916 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qswx2" event={"ID":"8a10def1-5478-48e8-acb6-3cc2e549e94a","Type":"ContainerStarted","Data":"391ad03063ba680a92ac4aecf60faa35a2157bdb7ae2e94a8fcd2e877c636109"}
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.340354 5118 generic.go:334] "Generic (PLEG): container finished" podID="d950224a-6bde-4dae-b8db-b76092cf5afc" containerID="32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d" exitCode=0
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.340419 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qrxc8" event={"ID":"d950224a-6bde-4dae-b8db-b76092cf5afc","Type":"ContainerDied","Data":"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d"}
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.340424 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qrxc8"
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.340456 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qrxc8" event={"ID":"d950224a-6bde-4dae-b8db-b76092cf5afc","Type":"ContainerDied","Data":"940326f4afe8a87347320183705f6f57f320075e82c775d299415acf9272560a"}
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.340480 5118 scope.go:117] "RemoveContainer" containerID="32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d"
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.376198 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qswx2" podStartSLOduration=1.9419945159999998 podStartE2EDuration="2.376077328s" podCreationTimestamp="2026-02-23 07:02:11 +0000 UTC" firstStartedPulling="2026-02-23 07:02:12.622683193 +0000 UTC m=+995.626467776" lastFinishedPulling="2026-02-23 07:02:13.056765995 +0000 UTC m=+996.060550588" observedRunningTime="2026-02-23 07:02:13.362566034 +0000 UTC m=+996.366350607" watchObservedRunningTime="2026-02-23 07:02:13.376077328 +0000 UTC m=+996.379861901"
Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.377567 5118 scope.go:117] "RemoveContainer" containerID="32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d"
Feb 23
crc kubenswrapper[5118]: E0223 07:02:13.378240 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d\": container with ID starting with 32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d not found: ID does not exist" containerID="32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d" Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.378286 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d"} err="failed to get container status \"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d\": rpc error: code = NotFound desc = could not find container \"32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d\": container with ID starting with 32341e720947adc2a147591f5359044cc4778fdc807a5760c559e97f6e83476d not found: ID does not exist" Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.381610 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"] Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.386542 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qrxc8"] Feb 23 07:02:13 crc kubenswrapper[5118]: E0223 07:02:13.434989 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd950224a_6bde_4dae_b8db_b76092cf5afc.slice/crio-940326f4afe8a87347320183705f6f57f320075e82c775d299415acf9272560a\": RecentStats: unable to find data in memory cache]" Feb 23 07:02:13 crc kubenswrapper[5118]: I0223 07:02:13.712062 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d950224a-6bde-4dae-b8db-b76092cf5afc" 
path="/var/lib/kubelet/pods/d950224a-6bde-4dae-b8db-b76092cf5afc/volumes" Feb 23 07:02:22 crc kubenswrapper[5118]: I0223 07:02:22.087831 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qswx2" Feb 23 07:02:22 crc kubenswrapper[5118]: I0223 07:02:22.088523 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qswx2" Feb 23 07:02:22 crc kubenswrapper[5118]: I0223 07:02:22.140951 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qswx2" Feb 23 07:02:22 crc kubenswrapper[5118]: I0223 07:02:22.462643 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qswx2" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.013804 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn"] Feb 23 07:02:31 crc kubenswrapper[5118]: E0223 07:02:31.015037 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d950224a-6bde-4dae-b8db-b76092cf5afc" containerName="registry-server" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.015063 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d950224a-6bde-4dae-b8db-b76092cf5afc" containerName="registry-server" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.015328 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d950224a-6bde-4dae-b8db-b76092cf5afc" containerName="registry-server" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.017288 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.020741 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-l65mv" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.026658 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn"] Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.147342 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr8tc\" (UniqueName: \"kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.147449 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.147640 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 
07:02:31.249261 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.249738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr8tc\" (UniqueName: \"kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.249937 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.250263 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.250653 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.284054 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr8tc\" (UniqueName: \"kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.355703 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:31 crc kubenswrapper[5118]: I0223 07:02:31.654771 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn"] Feb 23 07:02:32 crc kubenswrapper[5118]: I0223 07:02:32.499233 5118 generic.go:334] "Generic (PLEG): container finished" podID="300729f0-11bf-4947-b58d-1a60bd0af741" containerID="9bbe0ea212d87e61e193937c7a7582c6272d3325476d909fb8357dfc32f6287f" exitCode=0 Feb 23 07:02:32 crc kubenswrapper[5118]: I0223 07:02:32.499342 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" event={"ID":"300729f0-11bf-4947-b58d-1a60bd0af741","Type":"ContainerDied","Data":"9bbe0ea212d87e61e193937c7a7582c6272d3325476d909fb8357dfc32f6287f"} Feb 23 07:02:32 crc kubenswrapper[5118]: I0223 07:02:32.499634 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" event={"ID":"300729f0-11bf-4947-b58d-1a60bd0af741","Type":"ContainerStarted","Data":"99c9ac19d04750e508c08352946a152612138253a98ceced25e71476f70a6442"} Feb 23 07:02:32 crc kubenswrapper[5118]: I0223 07:02:32.502050 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:02:33 crc kubenswrapper[5118]: E0223 07:02:33.836008 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300729f0_11bf_4947_b58d_1a60bd0af741.slice/crio-8da8fd40bb98ad803738fb491d143a32d59b8782d6dfda118871455b4aac208f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300729f0_11bf_4947_b58d_1a60bd0af741.slice/crio-conmon-8da8fd40bb98ad803738fb491d143a32d59b8782d6dfda118871455b4aac208f.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:02:34 crc kubenswrapper[5118]: I0223 07:02:34.522033 5118 generic.go:334] "Generic (PLEG): container finished" podID="300729f0-11bf-4947-b58d-1a60bd0af741" containerID="8da8fd40bb98ad803738fb491d143a32d59b8782d6dfda118871455b4aac208f" exitCode=0 Feb 23 07:02:34 crc kubenswrapper[5118]: I0223 07:02:34.522171 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" event={"ID":"300729f0-11bf-4947-b58d-1a60bd0af741","Type":"ContainerDied","Data":"8da8fd40bb98ad803738fb491d143a32d59b8782d6dfda118871455b4aac208f"} Feb 23 07:02:35 crc kubenswrapper[5118]: I0223 07:02:35.533750 5118 generic.go:334] "Generic (PLEG): container finished" podID="300729f0-11bf-4947-b58d-1a60bd0af741" containerID="5adcc7244f4bd69ff226e6d99626ce285d0af7c76a78b69d6655a9db9b70aa85" exitCode=0 Feb 23 07:02:35 crc kubenswrapper[5118]: I0223 07:02:35.533808 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" event={"ID":"300729f0-11bf-4947-b58d-1a60bd0af741","Type":"ContainerDied","Data":"5adcc7244f4bd69ff226e6d99626ce285d0af7c76a78b69d6655a9db9b70aa85"} Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.906723 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.983958 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util\") pod \"300729f0-11bf-4947-b58d-1a60bd0af741\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.984165 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr8tc\" (UniqueName: \"kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc\") pod \"300729f0-11bf-4947-b58d-1a60bd0af741\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.984232 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle\") pod \"300729f0-11bf-4947-b58d-1a60bd0af741\" (UID: \"300729f0-11bf-4947-b58d-1a60bd0af741\") " Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.985514 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle" (OuterVolumeSpecName: "bundle") pod "300729f0-11bf-4947-b58d-1a60bd0af741" (UID: "300729f0-11bf-4947-b58d-1a60bd0af741"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:36 crc kubenswrapper[5118]: I0223 07:02:36.993785 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc" (OuterVolumeSpecName: "kube-api-access-nr8tc") pod "300729f0-11bf-4947-b58d-1a60bd0af741" (UID: "300729f0-11bf-4947-b58d-1a60bd0af741"). InnerVolumeSpecName "kube-api-access-nr8tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.085289 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr8tc\" (UniqueName: \"kubernetes.io/projected/300729f0-11bf-4947-b58d-1a60bd0af741-kube-api-access-nr8tc\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.085330 5118 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.111556 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util" (OuterVolumeSpecName: "util") pod "300729f0-11bf-4947-b58d-1a60bd0af741" (UID: "300729f0-11bf-4947-b58d-1a60bd0af741"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.186525 5118 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/300729f0-11bf-4947-b58d-1a60bd0af741-util\") on node \"crc\" DevicePath \"\"" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.563458 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" event={"ID":"300729f0-11bf-4947-b58d-1a60bd0af741","Type":"ContainerDied","Data":"99c9ac19d04750e508c08352946a152612138253a98ceced25e71476f70a6442"} Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.563964 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99c9ac19d04750e508c08352946a152612138253a98ceced25e71476f70a6442" Feb 23 07:02:37 crc kubenswrapper[5118]: I0223 07:02:37.563589 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.397337 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp"] Feb 23 07:02:43 crc kubenswrapper[5118]: E0223 07:02:43.399423 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="util" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.399534 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="util" Feb 23 07:02:43 crc kubenswrapper[5118]: E0223 07:02:43.399638 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="pull" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.399709 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="pull" Feb 23 07:02:43 crc kubenswrapper[5118]: E0223 07:02:43.399783 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="extract" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.399853 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="extract" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.400115 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="300729f0-11bf-4947-b58d-1a60bd0af741" containerName="extract" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.400747 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.406803 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-dfrgw" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.422209 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp"] Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.492668 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfssm\" (UniqueName: \"kubernetes.io/projected/e58e8816-6d4e-48bd-b00e-956269a79c2f-kube-api-access-jfssm\") pod \"openstack-operator-controller-init-6679bf9b57-m96kp\" (UID: \"e58e8816-6d4e-48bd-b00e-956269a79c2f\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.593675 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfssm\" (UniqueName: 
\"kubernetes.io/projected/e58e8816-6d4e-48bd-b00e-956269a79c2f-kube-api-access-jfssm\") pod \"openstack-operator-controller-init-6679bf9b57-m96kp\" (UID: \"e58e8816-6d4e-48bd-b00e-956269a79c2f\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.619805 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfssm\" (UniqueName: \"kubernetes.io/projected/e58e8816-6d4e-48bd-b00e-956269a79c2f-kube-api-access-jfssm\") pod \"openstack-operator-controller-init-6679bf9b57-m96kp\" (UID: \"e58e8816-6d4e-48bd-b00e-956269a79c2f\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:43 crc kubenswrapper[5118]: I0223 07:02:43.721890 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:44 crc kubenswrapper[5118]: I0223 07:02:44.236507 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp"] Feb 23 07:02:44 crc kubenswrapper[5118]: I0223 07:02:44.626283 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" event={"ID":"e58e8816-6d4e-48bd-b00e-956269a79c2f","Type":"ContainerStarted","Data":"4d57353778029d179a2071a49f4eaba659610013efbe352a23db27c9ae2b6fcc"} Feb 23 07:02:49 crc kubenswrapper[5118]: I0223 07:02:49.667838 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" event={"ID":"e58e8816-6d4e-48bd-b00e-956269a79c2f","Type":"ContainerStarted","Data":"2834e98cc28de45b0bfc9dea7941870c6310d195b1ea6e4dea572c432c39e527"} Feb 23 07:02:49 crc kubenswrapper[5118]: I0223 07:02:49.669000 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:02:49 crc kubenswrapper[5118]: I0223 07:02:49.724747 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" podStartSLOduration=2.314282115 podStartE2EDuration="6.724722707s" podCreationTimestamp="2026-02-23 07:02:43 +0000 UTC" firstStartedPulling="2026-02-23 07:02:44.256342402 +0000 UTC m=+1027.260126965" lastFinishedPulling="2026-02-23 07:02:48.666782984 +0000 UTC m=+1031.670567557" observedRunningTime="2026-02-23 07:02:49.714593234 +0000 UTC m=+1032.718377877" watchObservedRunningTime="2026-02-23 07:02:49.724722707 +0000 UTC m=+1032.728507290" Feb 23 07:02:53 crc kubenswrapper[5118]: I0223 07:02:53.726064 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-m96kp" Feb 23 07:03:02 crc kubenswrapper[5118]: I0223 07:03:02.975355 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:03:02 crc kubenswrapper[5118]: I0223 07:03:02.976442 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.100403 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.104962 5118 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.107286 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xc4zx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.115558 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.126038 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.127270 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.130672 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-fzs65" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.143935 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.144848 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.147519 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xk2d9" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.175167 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.183171 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-98cgm"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.184150 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.187088 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.189268 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-n9pbb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.216518 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.217463 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.223911 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-v5f8t" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.238468 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-98cgm"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.242224 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.244271 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lz8z\" (UniqueName: \"kubernetes.io/projected/f6ab91fc-435e-4d88-8dfa-837000d8decd-kube-api-access-8lz8z\") pod \"barbican-operator-controller-manager-868647ff47-x8xz4\" (UID: \"f6ab91fc-435e-4d88-8dfa-837000d8decd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.244311 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr48c\" (UniqueName: \"kubernetes.io/projected/f2be56ef-55b5-42c9-bf9e-7db0f6148b49-kube-api-access-tr48c\") pod \"cinder-operator-controller-manager-5d946d989d-bz8br\" (UID: \"f2be56ef-55b5-42c9-bf9e-7db0f6148b49\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.248173 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.249334 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.264372 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kff6x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.267648 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.285297 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-79wlh"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.286222 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.287909 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.291254 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-79wlh"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.296220 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rws6q" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.301467 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.303036 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.307063 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.314717 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.315783 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.316284 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qzcf8" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.319751 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-h8pqz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.320847 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.345183 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx2s\" (UniqueName: \"kubernetes.io/projected/323a4a15-ada9-44e1-9fb1-4533964ab4b8-kube-api-access-nbx2s\") pod \"designate-operator-controller-manager-6d8bf5c495-dhlhx\" (UID: \"323a4a15-ada9-44e1-9fb1-4533964ab4b8\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.345267 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lz8z\" (UniqueName: 
\"kubernetes.io/projected/f6ab91fc-435e-4d88-8dfa-837000d8decd-kube-api-access-8lz8z\") pod \"barbican-operator-controller-manager-868647ff47-x8xz4\" (UID: \"f6ab91fc-435e-4d88-8dfa-837000d8decd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.345292 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gv8z\" (UniqueName: \"kubernetes.io/projected/618a679a-a0c0-40a1-b3b3-79e0834aef7c-kube-api-access-4gv8z\") pod \"glance-operator-controller-manager-77987464f4-98cgm\" (UID: \"618a679a-a0c0-40a1-b3b3-79e0834aef7c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.345320 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkhfv\" (UniqueName: \"kubernetes.io/projected/22db249a-2327-4cb1-a11c-72ac99925b27-kube-api-access-mkhfv\") pod \"heat-operator-controller-manager-69f49c598c-d6j2x\" (UID: \"22db249a-2327-4cb1-a11c-72ac99925b27\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.345342 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr48c\" (UniqueName: \"kubernetes.io/projected/f2be56ef-55b5-42c9-bf9e-7db0f6148b49-kube-api-access-tr48c\") pod \"cinder-operator-controller-manager-5d946d989d-bz8br\" (UID: \"f2be56ef-55b5-42c9-bf9e-7db0f6148b49\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.372887 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr48c\" (UniqueName: \"kubernetes.io/projected/f2be56ef-55b5-42c9-bf9e-7db0f6148b49-kube-api-access-tr48c\") pod 
\"cinder-operator-controller-manager-5d946d989d-bz8br\" (UID: \"f2be56ef-55b5-42c9-bf9e-7db0f6148b49\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.396903 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lz8z\" (UniqueName: \"kubernetes.io/projected/f6ab91fc-435e-4d88-8dfa-837000d8decd-kube-api-access-8lz8z\") pod \"barbican-operator-controller-manager-868647ff47-x8xz4\" (UID: \"f6ab91fc-435e-4d88-8dfa-837000d8decd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.420175 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.421176 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.426742 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.427720 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.433885 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.441486 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447677 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx2s\" (UniqueName: \"kubernetes.io/projected/323a4a15-ada9-44e1-9fb1-4533964ab4b8-kube-api-access-nbx2s\") pod \"designate-operator-controller-manager-6d8bf5c495-dhlhx\" (UID: \"323a4a15-ada9-44e1-9fb1-4533964ab4b8\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447757 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpz6\" (UniqueName: \"kubernetes.io/projected/d42094d4-0a02-4b3d-8bad-7d0a0bda3d47-kube-api-access-gnpz6\") pod \"keystone-operator-controller-manager-b4d948c87-98sv6\" (UID: \"d42094d4-0a02-4b3d-8bad-7d0a0bda3d47\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447796 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/29f5282f-4695-42d1-9ebd-6fd3d8acc24b-kube-api-access-5qjdw\") pod \"ironic-operator-controller-manager-554564d7fc-7m4sj\" (UID: \"29f5282f-4695-42d1-9ebd-6fd3d8acc24b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447841 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-256nl\" (UniqueName: 
\"kubernetes.io/projected/f9e92ac7-cd2b-4c82-94a1-213c37487e61-kube-api-access-256nl\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447875 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gv8z\" (UniqueName: \"kubernetes.io/projected/618a679a-a0c0-40a1-b3b3-79e0834aef7c-kube-api-access-4gv8z\") pod \"glance-operator-controller-manager-77987464f4-98cgm\" (UID: \"618a679a-a0c0-40a1-b3b3-79e0834aef7c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447912 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sppnc\" (UniqueName: \"kubernetes.io/projected/3fe2a698-e792-41a1-92bc-589d0bab315b-kube-api-access-sppnc\") pod \"horizon-operator-controller-manager-5b9b8895d5-vqkbz\" (UID: \"3fe2a698-e792-41a1-92bc-589d0bab315b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447939 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkhfv\" (UniqueName: \"kubernetes.io/projected/22db249a-2327-4cb1-a11c-72ac99925b27-kube-api-access-mkhfv\") pod \"heat-operator-controller-manager-69f49c598c-d6j2x\" (UID: \"22db249a-2327-4cb1-a11c-72ac99925b27\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.447993 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: 
\"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.459975 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.460243 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.460505 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fkszk" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.460607 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-d7tbw" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.467162 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.468227 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.468648 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.479014 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-q6b4r" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.484045 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx2s\" (UniqueName: \"kubernetes.io/projected/323a4a15-ada9-44e1-9fb1-4533964ab4b8-kube-api-access-nbx2s\") pod \"designate-operator-controller-manager-6d8bf5c495-dhlhx\" (UID: \"323a4a15-ada9-44e1-9fb1-4533964ab4b8\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.487720 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkhfv\" (UniqueName: \"kubernetes.io/projected/22db249a-2327-4cb1-a11c-72ac99925b27-kube-api-access-mkhfv\") pod \"heat-operator-controller-manager-69f49c598c-d6j2x\" (UID: \"22db249a-2327-4cb1-a11c-72ac99925b27\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.495411 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.497387 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.497864 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.498781 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.505976 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.507203 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-b5ggc" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.507436 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mghnh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.512175 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.519787 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gv8z\" (UniqueName: \"kubernetes.io/projected/618a679a-a0c0-40a1-b3b3-79e0834aef7c-kube-api-access-4gv8z\") pod \"glance-operator-controller-manager-77987464f4-98cgm\" (UID: \"618a679a-a0c0-40a1-b3b3-79e0834aef7c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.537821 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5"] Feb 23 07:03:30 crc 
kubenswrapper[5118]: I0223 07:03:30.539042 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.544296 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.544437 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.544635 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2js5p" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.545200 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.545269 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.545201 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.548762 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hvnlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.548950 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549597 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjzw\" (UniqueName: \"kubernetes.io/projected/133701aa-5bf0-4431-90bd-4a67b0c71a8f-kube-api-access-dnjzw\") pod \"mariadb-operator-controller-manager-6994f66f48-kprdd\" (UID: \"133701aa-5bf0-4431-90bd-4a67b0c71a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549638 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxwn\" (UniqueName: \"kubernetes.io/projected/e729f808-0740-4153-aa13-93bdfc02da47-kube-api-access-dpxwn\") pod \"manila-operator-controller-manager-54f6768c69-bfhhb\" (UID: \"e729f808-0740-4153-aa13-93bdfc02da47\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549672 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: 
\"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549718 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpz6\" (UniqueName: \"kubernetes.io/projected/d42094d4-0a02-4b3d-8bad-7d0a0bda3d47-kube-api-access-gnpz6\") pod \"keystone-operator-controller-manager-b4d948c87-98sv6\" (UID: \"d42094d4-0a02-4b3d-8bad-7d0a0bda3d47\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549742 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/29f5282f-4695-42d1-9ebd-6fd3d8acc24b-kube-api-access-5qjdw\") pod \"ironic-operator-controller-manager-554564d7fc-7m4sj\" (UID: \"29f5282f-4695-42d1-9ebd-6fd3d8acc24b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549777 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-256nl\" (UniqueName: \"kubernetes.io/projected/f9e92ac7-cd2b-4c82-94a1-213c37487e61-kube-api-access-256nl\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.549808 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sppnc\" (UniqueName: \"kubernetes.io/projected/3fe2a698-e792-41a1-92bc-589d0bab315b-kube-api-access-sppnc\") pod \"horizon-operator-controller-manager-5b9b8895d5-vqkbz\" (UID: \"3fe2a698-e792-41a1-92bc-589d0bab315b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:30 crc kubenswrapper[5118]: 
E0223 07:03:30.550179 5118 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:30 crc kubenswrapper[5118]: E0223 07:03:30.550232 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert podName:f9e92ac7-cd2b-4c82-94a1-213c37487e61 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:31.050214664 +0000 UTC m=+1074.053999237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert") pod "infra-operator-controller-manager-79d975b745-79wlh" (UID: "f9e92ac7-cd2b-4c82-94a1-213c37487e61") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.555375 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.556391 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.580750 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cf2fm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.586541 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjdw\" (UniqueName: \"kubernetes.io/projected/29f5282f-4695-42d1-9ebd-6fd3d8acc24b-kube-api-access-5qjdw\") pod \"ironic-operator-controller-manager-554564d7fc-7m4sj\" (UID: \"29f5282f-4695-42d1-9ebd-6fd3d8acc24b\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.607850 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.609057 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.610345 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-256nl\" (UniqueName: \"kubernetes.io/projected/f9e92ac7-cd2b-4c82-94a1-213c37487e61-kube-api-access-256nl\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.612594 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zjr9l" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.622148 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpz6\" (UniqueName: \"kubernetes.io/projected/d42094d4-0a02-4b3d-8bad-7d0a0bda3d47-kube-api-access-gnpz6\") pod \"keystone-operator-controller-manager-b4d948c87-98sv6\" (UID: \"d42094d4-0a02-4b3d-8bad-7d0a0bda3d47\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.627599 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sppnc\" (UniqueName: \"kubernetes.io/projected/3fe2a698-e792-41a1-92bc-589d0bab315b-kube-api-access-sppnc\") pod \"horizon-operator-controller-manager-5b9b8895d5-vqkbz\" (UID: \"3fe2a698-e792-41a1-92bc-589d0bab315b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.630325 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.632054 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.636364 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651364 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7p5x\" (UniqueName: \"kubernetes.io/projected/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-kube-api-access-f7p5x\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651416 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzx8w\" (UniqueName: \"kubernetes.io/projected/292023bd-64ab-499f-aabc-43b314775b62-kube-api-access-bzx8w\") pod \"octavia-operator-controller-manager-69f8888797-gvdcc\" (UID: \"292023bd-64ab-499f-aabc-43b314775b62\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651470 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4c2q\" (UniqueName: 
\"kubernetes.io/projected/61637571-dc55-41bf-9765-415fa7a78100-kube-api-access-p4c2q\") pod \"neutron-operator-controller-manager-64ddbf8bb-qhvx6\" (UID: \"61637571-dc55-41bf-9765-415fa7a78100\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651496 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mns\" (UniqueName: \"kubernetes.io/projected/9885aacd-cab4-40c1-b504-067fa6b38393-kube-api-access-t2mns\") pod \"placement-operator-controller-manager-8497b45c89-5zc5v\" (UID: \"9885aacd-cab4-40c1-b504-067fa6b38393\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651523 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr45\" (UniqueName: \"kubernetes.io/projected/e56300c0-de91-4f3e-9b0b-6e5303e7561d-kube-api-access-zjr45\") pod \"ovn-operator-controller-manager-d44cf6b75-z5qcz\" (UID: \"e56300c0-de91-4f3e-9b0b-6e5303e7561d\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651550 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.651567 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjzw\" (UniqueName: \"kubernetes.io/projected/133701aa-5bf0-4431-90bd-4a67b0c71a8f-kube-api-access-dnjzw\") pod \"mariadb-operator-controller-manager-6994f66f48-kprdd\" (UID: \"133701aa-5bf0-4431-90bd-4a67b0c71a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.652083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxwn\" (UniqueName: \"kubernetes.io/projected/e729f808-0740-4153-aa13-93bdfc02da47-kube-api-access-dpxwn\") pod \"manila-operator-controller-manager-54f6768c69-bfhhb\" (UID: \"e729f808-0740-4153-aa13-93bdfc02da47\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.652126 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjglg\" (UniqueName: \"kubernetes.io/projected/f392b9d5-c8ed-4976-a7df-96cbbd8faad0-kube-api-access-mjglg\") pod \"nova-operator-controller-manager-567668f5cf-54gzb\" (UID: \"f392b9d5-c8ed-4976-a7df-96cbbd8faad0\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.668979 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.673193 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.677155 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-92dh7" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.681570 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjzw\" (UniqueName: \"kubernetes.io/projected/133701aa-5bf0-4431-90bd-4a67b0c71a8f-kube-api-access-dnjzw\") pod \"mariadb-operator-controller-manager-6994f66f48-kprdd\" (UID: \"133701aa-5bf0-4431-90bd-4a67b0c71a8f\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.681630 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mgwl7"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.682529 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.690521 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rszdf" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.702798 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxwn\" (UniqueName: \"kubernetes.io/projected/e729f808-0740-4153-aa13-93bdfc02da47-kube-api-access-dpxwn\") pod \"manila-operator-controller-manager-54f6768c69-bfhhb\" (UID: \"e729f808-0740-4153-aa13-93bdfc02da47\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.719845 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mgwl7"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.734440 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.753693 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jllbq\" (UniqueName: \"kubernetes.io/projected/a66d9cfc-3b02-4896-8dda-bcbeddb5c803-kube-api-access-jllbq\") pod \"swift-operator-controller-manager-68f46476f-4x7fr\" (UID: \"a66d9cfc-3b02-4896-8dda-bcbeddb5c803\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.753776 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr45\" (UniqueName: \"kubernetes.io/projected/e56300c0-de91-4f3e-9b0b-6e5303e7561d-kube-api-access-zjr45\") pod \"ovn-operator-controller-manager-d44cf6b75-z5qcz\" (UID: \"e56300c0-de91-4f3e-9b0b-6e5303e7561d\") 
" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.753823 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjglg\" (UniqueName: \"kubernetes.io/projected/f392b9d5-c8ed-4976-a7df-96cbbd8faad0-kube-api-access-mjglg\") pod \"nova-operator-controller-manager-567668f5cf-54gzb\" (UID: \"f392b9d5-c8ed-4976-a7df-96cbbd8faad0\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.766403 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7p5x\" (UniqueName: \"kubernetes.io/projected/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-kube-api-access-f7p5x\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.766454 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.766479 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzx8w\" (UniqueName: \"kubernetes.io/projected/292023bd-64ab-499f-aabc-43b314775b62-kube-api-access-bzx8w\") pod \"octavia-operator-controller-manager-69f8888797-gvdcc\" (UID: \"292023bd-64ab-499f-aabc-43b314775b62\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.766509 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4c2q\" (UniqueName: \"kubernetes.io/projected/61637571-dc55-41bf-9765-415fa7a78100-kube-api-access-p4c2q\") pod \"neutron-operator-controller-manager-64ddbf8bb-qhvx6\" (UID: \"61637571-dc55-41bf-9765-415fa7a78100\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.766558 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mns\" (UniqueName: \"kubernetes.io/projected/9885aacd-cab4-40c1-b504-067fa6b38393-kube-api-access-t2mns\") pod \"placement-operator-controller-manager-8497b45c89-5zc5v\" (UID: \"9885aacd-cab4-40c1-b504-067fa6b38393\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:30 crc kubenswrapper[5118]: E0223 07:03:30.766737 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:30 crc kubenswrapper[5118]: E0223 07:03:30.766843 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:03:31.26681144 +0000 UTC m=+1074.270596013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.769565 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.770687 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.778210 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dqv2t" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.786779 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.795396 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.797241 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr45\" (UniqueName: \"kubernetes.io/projected/e56300c0-de91-4f3e-9b0b-6e5303e7561d-kube-api-access-zjr45\") pod \"ovn-operator-controller-manager-d44cf6b75-z5qcz\" (UID: \"e56300c0-de91-4f3e-9b0b-6e5303e7561d\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.802857 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.806140 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjglg\" (UniqueName: \"kubernetes.io/projected/f392b9d5-c8ed-4976-a7df-96cbbd8faad0-kube-api-access-mjglg\") pod \"nova-operator-controller-manager-567668f5cf-54gzb\" (UID: \"f392b9d5-c8ed-4976-a7df-96cbbd8faad0\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.812302 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mns\" (UniqueName: \"kubernetes.io/projected/9885aacd-cab4-40c1-b504-067fa6b38393-kube-api-access-t2mns\") pod \"placement-operator-controller-manager-8497b45c89-5zc5v\" (UID: \"9885aacd-cab4-40c1-b504-067fa6b38393\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.813285 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzx8w\" (UniqueName: \"kubernetes.io/projected/292023bd-64ab-499f-aabc-43b314775b62-kube-api-access-bzx8w\") pod \"octavia-operator-controller-manager-69f8888797-gvdcc\" (UID: \"292023bd-64ab-499f-aabc-43b314775b62\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.815787 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4c2q\" (UniqueName: \"kubernetes.io/projected/61637571-dc55-41bf-9765-415fa7a78100-kube-api-access-p4c2q\") pod \"neutron-operator-controller-manager-64ddbf8bb-qhvx6\" (UID: \"61637571-dc55-41bf-9765-415fa7a78100\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.821055 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7p5x\" (UniqueName: \"kubernetes.io/projected/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-kube-api-access-f7p5x\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.860204 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.878006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jllbq\" (UniqueName: \"kubernetes.io/projected/a66d9cfc-3b02-4896-8dda-bcbeddb5c803-kube-api-access-jllbq\") pod \"swift-operator-controller-manager-68f46476f-4x7fr\" (UID: \"a66d9cfc-3b02-4896-8dda-bcbeddb5c803\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.884852 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.891602 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr27j\" (UniqueName: \"kubernetes.io/projected/e7dabd60-e0b9-4ddf-9775-67bf8f922e54-kube-api-access-gr27j\") pod \"test-operator-controller-manager-7866795846-mgwl7\" (UID: \"e7dabd60-e0b9-4ddf-9775-67bf8f922e54\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.891708 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jl6\" (UniqueName: \"kubernetes.io/projected/07e7dae1-170c-4c44-8ebf-4547edea1c65-kube-api-access-p7jl6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-4zd6f\" (UID: \"07e7dae1-170c-4c44-8ebf-4547edea1c65\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.900692 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jllbq\" (UniqueName: \"kubernetes.io/projected/a66d9cfc-3b02-4896-8dda-bcbeddb5c803-kube-api-access-jllbq\") pod \"swift-operator-controller-manager-68f46476f-4x7fr\" (UID: \"a66d9cfc-3b02-4896-8dda-bcbeddb5c803\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.901391 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.901502 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.926682 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.936629 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.948030 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.958864 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.959746 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.966567 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.967344 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.967380 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.969492 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vsnrz" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.971549 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.977855 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.979186 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.983329 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx"] Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.985050 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wglns" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.994732 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jl6\" (UniqueName: \"kubernetes.io/projected/07e7dae1-170c-4c44-8ebf-4547edea1c65-kube-api-access-p7jl6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-4zd6f\" (UID: \"07e7dae1-170c-4c44-8ebf-4547edea1c65\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.994795 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwkk\" (UniqueName: \"kubernetes.io/projected/9169039b-beb7-48c1-8dbc-485f91b6a2ac-kube-api-access-8pwkk\") pod \"watcher-operator-controller-manager-5db88f68c-xkf9z\" (UID: \"9169039b-beb7-48c1-8dbc-485f91b6a2ac\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:30 crc kubenswrapper[5118]: I0223 07:03:30.994906 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr27j\" (UniqueName: \"kubernetes.io/projected/e7dabd60-e0b9-4ddf-9775-67bf8f922e54-kube-api-access-gr27j\") pod \"test-operator-controller-manager-7866795846-mgwl7\" (UID: \"e7dabd60-e0b9-4ddf-9775-67bf8f922e54\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.014599 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.028792 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr27j\" (UniqueName: \"kubernetes.io/projected/e7dabd60-e0b9-4ddf-9775-67bf8f922e54-kube-api-access-gr27j\") pod \"test-operator-controller-manager-7866795846-mgwl7\" (UID: \"e7dabd60-e0b9-4ddf-9775-67bf8f922e54\") " pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.033643 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jl6\" (UniqueName: \"kubernetes.io/projected/07e7dae1-170c-4c44-8ebf-4547edea1c65-kube-api-access-p7jl6\") pod \"telemetry-operator-controller-manager-7f45b4ff68-4zd6f\" (UID: \"07e7dae1-170c-4c44-8ebf-4547edea1c65\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.098737 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nkrk\" (UniqueName: \"kubernetes.io/projected/39af282d-8ae3-4a52-9274-cd62d12a1ef1-kube-api-access-6nkrk\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.099305 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 
07:03:31.099334 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqlrm\" (UniqueName: \"kubernetes.io/projected/de50ead7-a26c-4c87-a379-7c83b3ffeabd-kube-api-access-bqlrm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcfgx\" (UID: \"de50ead7-a26c-4c87-a379-7c83b3ffeabd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.099713 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwkk\" (UniqueName: \"kubernetes.io/projected/9169039b-beb7-48c1-8dbc-485f91b6a2ac-kube-api-access-8pwkk\") pod \"watcher-operator-controller-manager-5db88f68c-xkf9z\" (UID: \"9169039b-beb7-48c1-8dbc-485f91b6a2ac\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.099749 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.099773 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.100446 5118 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.100498 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert podName:f9e92ac7-cd2b-4c82-94a1-213c37487e61 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:32.100478388 +0000 UTC m=+1075.104262961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert") pod "infra-operator-controller-manager-79d975b745-79wlh" (UID: "f9e92ac7-cd2b-4c82-94a1-213c37487e61") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.121342 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwkk\" (UniqueName: \"kubernetes.io/projected/9169039b-beb7-48c1-8dbc-485f91b6a2ac-kube-api-access-8pwkk\") pod \"watcher-operator-controller-manager-5db88f68c-xkf9z\" (UID: \"9169039b-beb7-48c1-8dbc-485f91b6a2ac\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.208242 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nkrk\" (UniqueName: \"kubernetes.io/projected/39af282d-8ae3-4a52-9274-cd62d12a1ef1-kube-api-access-6nkrk\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.208323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqlrm\" (UniqueName: \"kubernetes.io/projected/de50ead7-a26c-4c87-a379-7c83b3ffeabd-kube-api-access-bqlrm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcfgx\" (UID: \"de50ead7-a26c-4c87-a379-7c83b3ffeabd\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.208384 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.208408 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.208577 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.208645 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:31.708623368 +0000 UTC m=+1074.712407941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.208835 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.208865 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:31.708859153 +0000 UTC m=+1074.712643726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.220263 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br"] Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.229596 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqlrm\" (UniqueName: \"kubernetes.io/projected/de50ead7-a26c-4c87-a379-7c83b3ffeabd-kube-api-access-bqlrm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-dcfgx\" (UID: \"de50ead7-a26c-4c87-a379-7c83b3ffeabd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.233563 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nkrk\" (UniqueName: 
\"kubernetes.io/projected/39af282d-8ae3-4a52-9274-cd62d12a1ef1-kube-api-access-6nkrk\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.272614 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.298172 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.309170 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.309332 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.309375 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:03:32.309357938 +0000 UTC m=+1075.313142501 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.348700 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.382229 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.408595 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4"] Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.416397 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x"] Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.533606 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj"] Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.538515 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx"] Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.543047 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6"] Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.554774 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f5282f_4695_42d1_9ebd_6fd3d8acc24b.slice/crio-51dc2a432496eec539086129675b46fe33758f12c6308308952c72228e5f03bd WatchSource:0}: Error finding container 51dc2a432496eec539086129675b46fe33758f12c6308308952c72228e5f03bd: Status 404 returned error can't find the container with id 51dc2a432496eec539086129675b46fe33758f12c6308308952c72228e5f03bd Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.565594 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42094d4_0a02_4b3d_8bad_7d0a0bda3d47.slice/crio-20259f1dc3aee4401b2ea20462fd86ad7ed770a8557b90f08f70cda3bc124280 WatchSource:0}: Error finding container 20259f1dc3aee4401b2ea20462fd86ad7ed770a8557b90f08f70cda3bc124280: Status 404 returned error can't find the container with id 20259f1dc3aee4401b2ea20462fd86ad7ed770a8557b90f08f70cda3bc124280 Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.717312 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.717426 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.718085 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret 
"metrics-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.718122 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.718200 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:32.718174643 +0000 UTC m=+1075.721959216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.718218 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:32.718210713 +0000 UTC m=+1075.721995286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.748016 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v"]
Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.764684 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133701aa_5bf0_4431_90bd_4a67b0c71a8f.slice/crio-c765f114516dff36ffaf2d8626083a07e0579d8baef812897f1b1b0796f67f8f WatchSource:0}: Error finding container c765f114516dff36ffaf2d8626083a07e0579d8baef812897f1b1b0796f67f8f: Status 404 returned error can't find the container with id c765f114516dff36ffaf2d8626083a07e0579d8baef812897f1b1b0796f67f8f
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.771419 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd"]
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.783367 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz"]
Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.787058 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode56300c0_de91_4f3e_9b0b_6e5303e7561d.slice/crio-c8ae14b331fe4a5873786940d05433e9eea53ff289899b6cd0b830e86361664f WatchSource:0}: Error finding container c8ae14b331fe4a5873786940d05433e9eea53ff289899b6cd0b830e86361664f: Status 404 returned error can't find the container with id c8ae14b331fe4a5873786940d05433e9eea53ff289899b6cd0b830e86361664f
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.813170 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-98cgm"]
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.867529 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr"]
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.873414 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb"]
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.877752 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6"]
Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.879362 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode729f808_0740_4153_aa13_93bdfc02da47.slice/crio-b3cdc5c72ccca4c5aa3552446ba019dfc8dcc884566ec680c272ce9c51a1fce5 WatchSource:0}: Error finding container b3cdc5c72ccca4c5aa3552446ba019dfc8dcc884566ec680c272ce9c51a1fce5: Status 404 returned error can't find the container with id b3cdc5c72ccca4c5aa3552446ba019dfc8dcc884566ec680c272ce9c51a1fce5
Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.879993 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61637571_dc55_41bf_9765_415fa7a78100.slice/crio-24608a341c9a059eb95f6e1a4a085ed23534e766adc3db9b915c23549515dc17 WatchSource:0}: Error finding container 24608a341c9a059eb95f6e1a4a085ed23534e766adc3db9b915c23549515dc17: Status 404 returned error can't find the container with id 24608a341c9a059eb95f6e1a4a085ed23534e766adc3db9b915c23549515dc17
Feb 23 07:03:31 crc kubenswrapper[5118]: I0223 07:03:31.882526 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz"]
Feb 23 07:03:31 crc kubenswrapper[5118]: W0223 07:03:31.883299 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe2a698_e792_41a1_92bc_589d0bab315b.slice/crio-70304eba98e94f8b4826a1ecf237c7d6323539547730bea79c7daaeaa34ccdd7 WatchSource:0}: Error finding container 70304eba98e94f8b4826a1ecf237c7d6323539547730bea79c7daaeaa34ccdd7: Status 404 returned error can't find the container with id 70304eba98e94f8b4826a1ecf237c7d6323539547730bea79c7daaeaa34ccdd7
Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.887181 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sppnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-vqkbz_openstack-operators(3fe2a698-e792-41a1-92bc-589d0bab315b): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.888291 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" podUID="3fe2a698-e792-41a1-92bc-589d0bab315b"
Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.888409 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jllbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-4x7fr_openstack-operators(a66d9cfc-3b02-4896-8dda-bcbeddb5c803): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:31 crc kubenswrapper[5118]: E0223 07:03:31.889586 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" podUID="a66d9cfc-3b02-4896-8dda-bcbeddb5c803"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.024672 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f"]
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.033203 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb"]
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.038786 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc"]
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.041449 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" event={"ID":"22db249a-2327-4cb1-a11c-72ac99925b27","Type":"ContainerStarted","Data":"6609bba15c7341c8377186092bfaa1a7348e1439cafbcd88ef5375dc64c9e2b4"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.047150 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" event={"ID":"133701aa-5bf0-4431-90bd-4a67b0c71a8f","Type":"ContainerStarted","Data":"c765f114516dff36ffaf2d8626083a07e0579d8baef812897f1b1b0796f67f8f"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.049050 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" event={"ID":"a66d9cfc-3b02-4896-8dda-bcbeddb5c803","Type":"ContainerStarted","Data":"240139ebdaebdd4df7ac059cb5726ac119e576e9e6c996564287588f18e11558"}
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.050457 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" podUID="a66d9cfc-3b02-4896-8dda-bcbeddb5c803"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.050770 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" event={"ID":"e56300c0-de91-4f3e-9b0b-6e5303e7561d","Type":"ContainerStarted","Data":"c8ae14b331fe4a5873786940d05433e9eea53ff289899b6cd0b830e86361664f"}
Feb 23 07:03:32 crc kubenswrapper[5118]: W0223 07:03:32.053295 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e7dae1_170c_4c44_8ebf_4547edea1c65.slice/crio-09e7e99770468f3d954fae1c798726311b38b29803919a66a2e290baabef23ab WatchSource:0}: Error finding container 09e7e99770468f3d954fae1c798726311b38b29803919a66a2e290baabef23ab: Status 404 returned error can't find the container with id 09e7e99770468f3d954fae1c798726311b38b29803919a66a2e290baabef23ab
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.053326 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-mgwl7"]
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.053774 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" event={"ID":"61637571-dc55-41bf-9765-415fa7a78100","Type":"ContainerStarted","Data":"24608a341c9a059eb95f6e1a4a085ed23534e766adc3db9b915c23549515dc17"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.055945 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" event={"ID":"618a679a-a0c0-40a1-b3b3-79e0834aef7c","Type":"ContainerStarted","Data":"820bac2e798799938f9f6b9edb0cd95859a72af40d7123a99c2b6f71163304cb"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.085587 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z"]
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.099887 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" event={"ID":"e729f808-0740-4153-aa13-93bdfc02da47","Type":"ContainerStarted","Data":"b3cdc5c72ccca4c5aa3552446ba019dfc8dcc884566ec680c272ce9c51a1fce5"}
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.110805 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bzx8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-gvdcc_openstack-operators(292023bd-64ab-499f-aabc-43b314775b62): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.110974 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8pwkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-xkf9z_openstack-operators(9169039b-beb7-48c1-8dbc-485f91b6a2ac): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.112316 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" podUID="9169039b-beb7-48c1-8dbc-485f91b6a2ac"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.112400 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" podUID="292023bd-64ab-499f-aabc-43b314775b62"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.125264 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjglg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-54gzb_openstack-operators(f392b9d5-c8ed-4976-a7df-96cbbd8faad0): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.126900 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" podUID="f392b9d5-c8ed-4976-a7df-96cbbd8faad0"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.129405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.129753 5118 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.129839 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert podName:f9e92ac7-cd2b-4c82-94a1-213c37487e61 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:34.129811855 +0000 UTC m=+1077.133596468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert") pod "infra-operator-controller-manager-79d975b745-79wlh" (UID: "f9e92ac7-cd2b-4c82-94a1-213c37487e61") : secret "infra-operator-webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.147052 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" event={"ID":"29f5282f-4695-42d1-9ebd-6fd3d8acc24b","Type":"ContainerStarted","Data":"51dc2a432496eec539086129675b46fe33758f12c6308308952c72228e5f03bd"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.151312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" event={"ID":"f6ab91fc-435e-4d88-8dfa-837000d8decd","Type":"ContainerStarted","Data":"53436e00197f068c3fc5c2b0777ff1443e3e842a97f3fac19c178e223fed7bc0"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.157247 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" event={"ID":"d42094d4-0a02-4b3d-8bad-7d0a0bda3d47","Type":"ContainerStarted","Data":"20259f1dc3aee4401b2ea20462fd86ad7ed770a8557b90f08f70cda3bc124280"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.161734 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" event={"ID":"323a4a15-ada9-44e1-9fb1-4533964ab4b8","Type":"ContainerStarted","Data":"da7aa8d367d7334c11e59f911e619289d26eadf972eb31c93f8a7dad186b2f54"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.163672 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" event={"ID":"f2be56ef-55b5-42c9-bf9e-7db0f6148b49","Type":"ContainerStarted","Data":"c50b963901edf3acad20d12f1a14531d0ed9c2caabd28c4739b24ca8fbb5da75"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.164660 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" event={"ID":"9885aacd-cab4-40c1-b504-067fa6b38393","Type":"ContainerStarted","Data":"4e677102c871f10ac8648b4829f5d68936dedd166040fed04a0f2c17cb8bc3a9"}
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.183086 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" event={"ID":"3fe2a698-e792-41a1-92bc-589d0bab315b","Type":"ContainerStarted","Data":"70304eba98e94f8b4826a1ecf237c7d6323539547730bea79c7daaeaa34ccdd7"}
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.187036 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" podUID="3fe2a698-e792-41a1-92bc-589d0bab315b"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.227086 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx"]
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.289999 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bqlrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-dcfgx_openstack-operators(de50ead7-a26c-4c87-a379-7c83b3ffeabd): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.291422 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" podUID="de50ead7-a26c-4c87-a379-7c83b3ffeabd"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.332457 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.333086 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.333171 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:03:34.333151812 +0000 UTC m=+1077.336936385 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.741253 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr"
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.741315 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr"
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.741535 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.741547 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.741609 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:34.741588967 +0000 UTC m=+1077.745373530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: E0223 07:03:32.741696 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:34.741663369 +0000 UTC m=+1077.745447942 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.975317 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:03:32 crc kubenswrapper[5118]: I0223 07:03:32.975396 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.193200 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" event={"ID":"f392b9d5-c8ed-4976-a7df-96cbbd8faad0","Type":"ContainerStarted","Data":"b834e71fa483316d75fa2a079d5642418cdbf047163154bc7f17af1fe1082f64"}
Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.196028 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" podUID="f392b9d5-c8ed-4976-a7df-96cbbd8faad0"
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.200296 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" event={"ID":"292023bd-64ab-499f-aabc-43b314775b62","Type":"ContainerStarted","Data":"e97d519cb66fc00eb2a340f009683df780a18d94a8215255802b317bfd19dcb2"}
Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.202482 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" podUID="292023bd-64ab-499f-aabc-43b314775b62"
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.204871 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" event={"ID":"de50ead7-a26c-4c87-a379-7c83b3ffeabd","Type":"ContainerStarted","Data":"71cab06722a8399103b1485f00f387ff8da1761887bf46d7b83cc799815754aa"}
Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.207684 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" podUID="de50ead7-a26c-4c87-a379-7c83b3ffeabd"
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.209792 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" event={"ID":"e7dabd60-e0b9-4ddf-9775-67bf8f922e54","Type":"ContainerStarted","Data":"105b8fef1f58a6cd58082b38eeda16853f2125994d9e88da793ab27709786a9e"}
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.225690 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" event={"ID":"9169039b-beb7-48c1-8dbc-485f91b6a2ac","Type":"ContainerStarted","Data":"e420485c254f38be2707e0f2625dfde89a1395becc8b4dd89e27355e316682e9"}
Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.236832 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" podUID="9169039b-beb7-48c1-8dbc-485f91b6a2ac"
Feb 23 07:03:33 crc kubenswrapper[5118]: I0223 07:03:33.240494 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" event={"ID":"07e7dae1-170c-4c44-8ebf-4547edea1c65","Type":"ContainerStarted","Data":"09e7e99770468f3d954fae1c798726311b38b29803919a66a2e290baabef23ab"}
Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.243461 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" podUID="3fe2a698-e792-41a1-92bc-589d0bab315b" Feb 23 07:03:33 crc kubenswrapper[5118]: E0223 07:03:33.247307 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" podUID="a66d9cfc-3b02-4896-8dda-bcbeddb5c803" Feb 23 07:03:34 crc kubenswrapper[5118]: I0223 07:03:34.172861 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.173056 5118 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.173159 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert podName:f9e92ac7-cd2b-4c82-94a1-213c37487e61 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:38.173129721 +0000 UTC m=+1081.176914294 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert") pod "infra-operator-controller-manager-79d975b745-79wlh" (UID: "f9e92ac7-cd2b-4c82-94a1-213c37487e61") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.265440 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" podUID="9169039b-beb7-48c1-8dbc-485f91b6a2ac" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.265480 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" podUID="f392b9d5-c8ed-4976-a7df-96cbbd8faad0" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.265578 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" podUID="292023bd-64ab-499f-aabc-43b314775b62" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.277570 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" podUID="de50ead7-a26c-4c87-a379-7c83b3ffeabd" Feb 23 07:03:34 crc kubenswrapper[5118]: I0223 07:03:34.379634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.381024 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.383595 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:03:38.381158449 +0000 UTC m=+1081.384943022 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: I0223 07:03:34.789330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:34 crc kubenswrapper[5118]: I0223 07:03:34.789392 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.789531 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.789559 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.789632 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:38.789602975 +0000 UTC m=+1081.793387548 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found Feb 23 07:03:34 crc kubenswrapper[5118]: E0223 07:03:34.789669 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:38.789647657 +0000 UTC m=+1081.793432230 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: I0223 07:03:38.259502 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.259647 5118 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.260266 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert podName:f9e92ac7-cd2b-4c82-94a1-213c37487e61 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:46.260243822 +0000 UTC m=+1089.264028395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert") pod "infra-operator-controller-manager-79d975b745-79wlh" (UID: "f9e92ac7-cd2b-4c82-94a1-213c37487e61") : secret "infra-operator-webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: I0223 07:03:38.463821 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.464036 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.464114 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:03:46.46407612 +0000 UTC m=+1089.467860693 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: I0223 07:03:38.870775 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:38 crc kubenswrapper[5118]: I0223 07:03:38.870944 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.871048 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.871054 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.871142 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:46.871120362 +0000 UTC m=+1089.874904935 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found Feb 23 07:03:38 crc kubenswrapper[5118]: E0223 07:03:38.871179 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:03:46.871154713 +0000 UTC m=+1089.874939326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found Feb 23 07:03:44 crc kubenswrapper[5118]: E0223 07:03:44.850703 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6" Feb 23 07:03:44 crc kubenswrapper[5118]: E0223 07:03:44.851814 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gr27j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-mgwl7_openstack-operators(e7dabd60-e0b9-4ddf-9775-67bf8f922e54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:03:44 crc kubenswrapper[5118]: E0223 07:03:44.853061 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" podUID="e7dabd60-e0b9-4ddf-9775-67bf8f922e54" Feb 23 07:03:45 crc kubenswrapper[5118]: E0223 07:03:45.352717 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" podUID="e7dabd60-e0b9-4ddf-9775-67bf8f922e54" Feb 23 07:03:45 crc kubenswrapper[5118]: E0223 07:03:45.493130 5118 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99" Feb 23 07:03:45 crc kubenswrapper[5118]: E0223 07:03:45.493441 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p7jl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-4zd6f_openstack-operators(07e7dae1-170c-4c44-8ebf-4547edea1c65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:03:45 crc kubenswrapper[5118]: E0223 07:03:45.494679 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" podUID="07e7dae1-170c-4c44-8ebf-4547edea1c65" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.326230 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.332663 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f9e92ac7-cd2b-4c82-94a1-213c37487e61-cert\") pod \"infra-operator-controller-manager-79d975b745-79wlh\" (UID: \"f9e92ac7-cd2b-4c82-94a1-213c37487e61\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.359801 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" podUID="07e7dae1-170c-4c44-8ebf-4547edea1c65" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.516987 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rws6q" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.527241 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.530906 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.531141 5118 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.531227 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert podName:f4b0cd84-f619-415b-9d18-bc6de1b1f40b nodeName:}" failed. No retries permitted until 2026-02-23 07:04:02.531200359 +0000 UTC m=+1105.534984932 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" (UID: "f4b0cd84-f619-415b-9d18-bc6de1b1f40b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.938319 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:46 crc kubenswrapper[5118]: I0223 07:03:46.938470 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.938587 5118 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.938666 5118 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.938707 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:02.938674962 +0000 UTC m=+1105.942459575 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "webhook-server-cert" not found Feb 23 07:03:46 crc kubenswrapper[5118]: E0223 07:03:46.938752 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs podName:39af282d-8ae3-4a52-9274-cd62d12a1ef1 nodeName:}" failed. No retries permitted until 2026-02-23 07:04:02.938723943 +0000 UTC m=+1105.942508546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-b8rhr" (UID: "39af282d-8ae3-4a52-9274-cd62d12a1ef1") : secret "metrics-server-cert" not found Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.045947 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.046134 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p4c2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-qhvx6_openstack-operators(61637571-dc55-41bf-9765-415fa7a78100): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.048764 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" podUID="61637571-dc55-41bf-9765-415fa7a78100" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.366078 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" podUID="61637571-dc55-41bf-9765-415fa7a78100" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.595778 5118 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.595970 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gnpz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-98sv6_openstack-operators(d42094d4-0a02-4b3d-8bad-7d0a0bda3d47): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:03:47 crc kubenswrapper[5118]: E0223 07:03:47.598219 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" podUID="d42094d4-0a02-4b3d-8bad-7d0a0bda3d47" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.061125 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-79wlh"] Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.382082 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" event={"ID":"22db249a-2327-4cb1-a11c-72ac99925b27","Type":"ContainerStarted","Data":"53d7f926c82b39eacda98d69e0332e9eacd726ce59a5fa394aba5a0b4bb796df"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 
07:03:48.382196 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.390170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" event={"ID":"29f5282f-4695-42d1-9ebd-6fd3d8acc24b","Type":"ContainerStarted","Data":"aadf75e81dbbfa9756f89ff62f566427e7e68faf7a028e7943a9b66ee8d9b37f"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.390264 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.393321 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" event={"ID":"133701aa-5bf0-4431-90bd-4a67b0c71a8f","Type":"ContainerStarted","Data":"531633fae760f3678300c1c62ae9b6510c4ab9f8af39444eac902c435172b4c6"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.393434 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.404536 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" podStartSLOduration=2.357546958 podStartE2EDuration="18.404504259s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.537834769 +0000 UTC m=+1074.541619342" lastFinishedPulling="2026-02-23 07:03:47.58479207 +0000 UTC m=+1090.588576643" observedRunningTime="2026-02-23 07:03:48.402806488 +0000 UTC m=+1091.406591061" watchObservedRunningTime="2026-02-23 07:03:48.404504259 +0000 UTC m=+1091.408288832" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.405499 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" event={"ID":"323a4a15-ada9-44e1-9fb1-4533964ab4b8","Type":"ContainerStarted","Data":"7a4af61312601461cbcde61f50811cf2f63eb042f439596525af02bccb59116a"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.405764 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.449822 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" podStartSLOduration=2.638010688 podStartE2EDuration="18.449787296s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.772058448 +0000 UTC m=+1074.775843021" lastFinishedPulling="2026-02-23 07:03:47.583835056 +0000 UTC m=+1090.587619629" observedRunningTime="2026-02-23 07:03:48.435305609 +0000 UTC m=+1091.439090192" watchObservedRunningTime="2026-02-23 07:03:48.449787296 +0000 UTC m=+1091.453571869" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.456179 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" event={"ID":"9885aacd-cab4-40c1-b504-067fa6b38393","Type":"ContainerStarted","Data":"a4d8676a441e508b8d2bf28c5d7e067673e0cf38e67321f85fdb50a1636a69b3"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.456695 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.491222 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" 
event={"ID":"618a679a-a0c0-40a1-b3b3-79e0834aef7c","Type":"ContainerStarted","Data":"43758de3f7f51018c7fa05ab40ab81ef36e043c552e86759518a662bd6f711da"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.492028 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.501520 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" podStartSLOduration=2.477827458 podStartE2EDuration="18.501496329s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.560562415 +0000 UTC m=+1074.564346988" lastFinishedPulling="2026-02-23 07:03:47.584231286 +0000 UTC m=+1090.588015859" observedRunningTime="2026-02-23 07:03:48.47569951 +0000 UTC m=+1091.479484073" watchObservedRunningTime="2026-02-23 07:03:48.501496329 +0000 UTC m=+1091.505280902" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.507746 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" event={"ID":"e729f808-0740-4153-aa13-93bdfc02da47","Type":"ContainerStarted","Data":"4bfa1a03d309d6a4c5edce4bcd77bffc068804bb194624ab83d3b83d499c4518"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.508045 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.513417 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" podStartSLOduration=2.689214449 podStartE2EDuration="18.513399496s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.759802593 +0000 UTC m=+1074.763587166" 
lastFinishedPulling="2026-02-23 07:03:47.58398763 +0000 UTC m=+1090.587772213" observedRunningTime="2026-02-23 07:03:48.497728019 +0000 UTC m=+1091.501512592" watchObservedRunningTime="2026-02-23 07:03:48.513399496 +0000 UTC m=+1091.517184069" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.530052 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" podStartSLOduration=2.499366366 podStartE2EDuration="18.530031226s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.554333825 +0000 UTC m=+1074.558118398" lastFinishedPulling="2026-02-23 07:03:47.584998685 +0000 UTC m=+1090.588783258" observedRunningTime="2026-02-23 07:03:48.529667796 +0000 UTC m=+1091.533452369" watchObservedRunningTime="2026-02-23 07:03:48.530031226 +0000 UTC m=+1091.533815799" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.535536 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" event={"ID":"f6ab91fc-435e-4d88-8dfa-837000d8decd","Type":"ContainerStarted","Data":"976974b5d7da812446deaec1981f856d6a86951f62330e3a5a2c6b374948f61c"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.537012 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.540408 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.540616 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" 
event={"ID":"e56300c0-de91-4f3e-9b0b-6e5303e7561d","Type":"ContainerStarted","Data":"7f8405cc6b406297e26122ddf030220123659525d9b30e26f57d52ad628de196"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.566657 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" event={"ID":"f2be56ef-55b5-42c9-bf9e-7db0f6148b49","Type":"ContainerStarted","Data":"c2d5617d788d2aaf1338c44367bd4fc9e3b9e9fb1d5b66a6f5e8848a72c6b2da"} Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.566729 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:03:48 crc kubenswrapper[5118]: E0223 07:03:48.570852 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" podUID="d42094d4-0a02-4b3d-8bad-7d0a0bda3d47" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.583940 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" podStartSLOduration=2.882722439 podStartE2EDuration="18.58391398s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.883006254 +0000 UTC m=+1074.886790827" lastFinishedPulling="2026-02-23 07:03:47.584197795 +0000 UTC m=+1090.587982368" observedRunningTime="2026-02-23 07:03:48.5768189 +0000 UTC m=+1091.580603473" watchObservedRunningTime="2026-02-23 07:03:48.58391398 +0000 UTC m=+1091.587698553" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.612553 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" podStartSLOduration=2.855065485 podStartE2EDuration="18.612522698s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.826411584 +0000 UTC m=+1074.830196157" lastFinishedPulling="2026-02-23 07:03:47.583868797 +0000 UTC m=+1090.587653370" observedRunningTime="2026-02-23 07:03:48.599984027 +0000 UTC m=+1091.603768600" watchObservedRunningTime="2026-02-23 07:03:48.612522698 +0000 UTC m=+1091.616307271" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.628501 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" podStartSLOduration=2.334401091 podStartE2EDuration="18.628478551s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.292430401 +0000 UTC m=+1074.296214974" lastFinishedPulling="2026-02-23 07:03:47.586507861 +0000 UTC m=+1090.590292434" observedRunningTime="2026-02-23 07:03:48.623588094 +0000 UTC m=+1091.627372677" watchObservedRunningTime="2026-02-23 07:03:48.628478551 +0000 UTC m=+1091.632263114" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.680460 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" podStartSLOduration=2.886378868 podStartE2EDuration="18.68043555s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.789947198 +0000 UTC m=+1074.793731771" lastFinishedPulling="2026-02-23 07:03:47.58400388 +0000 UTC m=+1090.587788453" observedRunningTime="2026-02-23 07:03:48.660785677 +0000 UTC m=+1091.664570250" watchObservedRunningTime="2026-02-23 07:03:48.68043555 +0000 UTC m=+1091.684220123" Feb 23 07:03:48 crc kubenswrapper[5118]: I0223 07:03:48.695213 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" podStartSLOduration=2.540890194 podStartE2EDuration="18.695188895s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.430917899 +0000 UTC m=+1074.434702472" lastFinishedPulling="2026-02-23 07:03:47.5852166 +0000 UTC m=+1090.589001173" observedRunningTime="2026-02-23 07:03:48.693524274 +0000 UTC m=+1091.697308857" watchObservedRunningTime="2026-02-23 07:03:48.695188895 +0000 UTC m=+1091.698973468" Feb 23 07:03:49 crc kubenswrapper[5118]: W0223 07:03:49.745066 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9e92ac7_cd2b_4c82_94a1_213c37487e61.slice/crio-1a7a881884875c0d58e612519b81f216ae213ef5d2977e03cf3bb65d86ab171a WatchSource:0}: Error finding container 1a7a881884875c0d58e612519b81f216ae213ef5d2977e03cf3bb65d86ab171a: Status 404 returned error can't find the container with id 1a7a881884875c0d58e612519b81f216ae213ef5d2977e03cf3bb65d86ab171a Feb 23 07:03:50 crc kubenswrapper[5118]: I0223 07:03:50.582153 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" event={"ID":"f9e92ac7-cd2b-4c82-94a1-213c37487e61","Type":"ContainerStarted","Data":"1a7a881884875c0d58e612519b81f216ae213ef5d2977e03cf3bb65d86ab171a"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.642385 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" event={"ID":"f392b9d5-c8ed-4976-a7df-96cbbd8faad0","Type":"ContainerStarted","Data":"2ae3226e4bd2a1b51ce9ab00719b4619874fe2b0450fc6a0448c02be700cc345"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.643295 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:03:57 crc 
kubenswrapper[5118]: I0223 07:03:57.644646 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" event={"ID":"3fe2a698-e792-41a1-92bc-589d0bab315b","Type":"ContainerStarted","Data":"29a56d208f8c152f543f0dfccfd0dbe0a74ea1951654733c4d4e41745c756b66"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.644905 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.646815 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" event={"ID":"292023bd-64ab-499f-aabc-43b314775b62","Type":"ContainerStarted","Data":"bc7a7570af55bff6573163d5725145980f70c9ff8995a1573b688dcba0fc467f"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.646985 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.648562 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" event={"ID":"de50ead7-a26c-4c87-a379-7c83b3ffeabd","Type":"ContainerStarted","Data":"80c5a1dab5443592d2be7e9468183cb88c6428ab8fb1416d384500563656183e"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.650072 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" event={"ID":"a66d9cfc-3b02-4896-8dda-bcbeddb5c803","Type":"ContainerStarted","Data":"61afbee09d69272a0c0e2d81fa207982888adc6cb67a9528d775510d263e4480"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.650310 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.651879 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" event={"ID":"f9e92ac7-cd2b-4c82-94a1-213c37487e61","Type":"ContainerStarted","Data":"892b99d0abcce8ed841e17c2a04794c85779a3412a5b0caf0110dc7dd9b79846"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.651968 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.654006 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" event={"ID":"9169039b-beb7-48c1-8dbc-485f91b6a2ac","Type":"ContainerStarted","Data":"24989c6ef0c75ba56ea64ec2778a57ea0e7b391f9d8d7f6bdab07a7263ed72e8"} Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.654166 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.668914 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" podStartSLOduration=2.7224050159999997 podStartE2EDuration="27.66888712s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.111025064 +0000 UTC m=+1075.114809637" lastFinishedPulling="2026-02-23 07:03:57.057507168 +0000 UTC m=+1100.061291741" observedRunningTime="2026-02-23 07:03:57.662886677 +0000 UTC m=+1100.666671250" watchObservedRunningTime="2026-02-23 07:03:57.66888712 +0000 UTC m=+1100.672671693" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.683634 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" podStartSLOduration=6.663600151 podStartE2EDuration="27.683607464s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.110611264 +0000 UTC m=+1075.114395837" lastFinishedPulling="2026-02-23 07:03:53.130618567 +0000 UTC m=+1096.134403150" observedRunningTime="2026-02-23 07:03:57.67804607 +0000 UTC m=+1100.681830643" watchObservedRunningTime="2026-02-23 07:03:57.683607464 +0000 UTC m=+1100.687392037" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.697563 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" podStartSLOduration=2.799037858 podStartE2EDuration="27.697536658s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.110877141 +0000 UTC m=+1075.114661704" lastFinishedPulling="2026-02-23 07:03:57.009375931 +0000 UTC m=+1100.013160504" observedRunningTime="2026-02-23 07:03:57.695025178 +0000 UTC m=+1100.698809751" watchObservedRunningTime="2026-02-23 07:03:57.697536658 +0000 UTC m=+1100.701321231" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.714933 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-dcfgx" podStartSLOduration=2.896314366 podStartE2EDuration="27.714913987s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.289856732 +0000 UTC m=+1075.293641305" lastFinishedPulling="2026-02-23 07:03:57.108456323 +0000 UTC m=+1100.112240926" observedRunningTime="2026-02-23 07:03:57.70920209 +0000 UTC m=+1100.712986663" watchObservedRunningTime="2026-02-23 07:03:57.714913987 +0000 UTC m=+1100.718698560" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.743892 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" podStartSLOduration=2.624185606 podStartE2EDuration="27.743861062s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.88821913 +0000 UTC m=+1074.892003703" lastFinishedPulling="2026-02-23 07:03:57.007894586 +0000 UTC m=+1100.011679159" observedRunningTime="2026-02-23 07:03:57.742300294 +0000 UTC m=+1100.746084867" watchObservedRunningTime="2026-02-23 07:03:57.743861062 +0000 UTC m=+1100.747645635" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.770738 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" podStartSLOduration=7.767509761 podStartE2EDuration="27.770709208s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.88698762 +0000 UTC m=+1074.890772193" lastFinishedPulling="2026-02-23 07:03:51.890187067 +0000 UTC m=+1094.893971640" observedRunningTime="2026-02-23 07:03:57.766670381 +0000 UTC m=+1100.770454954" watchObservedRunningTime="2026-02-23 07:03:57.770709208 +0000 UTC m=+1100.774493781" Feb 23 07:03:57 crc kubenswrapper[5118]: I0223 07:03:57.787039 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" podStartSLOduration=20.491063302 podStartE2EDuration="27.787009419s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:49.762504284 +0000 UTC m=+1092.766288857" lastFinishedPulling="2026-02-23 07:03:57.058450401 +0000 UTC m=+1100.062234974" observedRunningTime="2026-02-23 07:03:57.781582428 +0000 UTC m=+1100.785367001" watchObservedRunningTime="2026-02-23 07:03:57.787009419 +0000 UTC m=+1100.790793992" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.437548 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-x8xz4" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.463755 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-bz8br" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.549155 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-d6j2x" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.638335 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-7m4sj" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.792258 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-dhlhx" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.806831 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-98cgm" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.865294 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-z5qcz" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.905847 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-kprdd" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.908596 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-5zc5v" Feb 23 07:04:00 crc kubenswrapper[5118]: I0223 07:04:00.933923 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-bfhhb" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.550880 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.561280 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b0cd84-f619-415b-9d18-bc6de1b1f40b-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5\" (UID: \"f4b0cd84-f619-415b-9d18-bc6de1b1f40b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.701407 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" event={"ID":"e7dabd60-e0b9-4ddf-9775-67bf8f922e54","Type":"ContainerStarted","Data":"968e6a157135ab4b510a30b87ab137f94f782ef6a233450bf4693ec27c9ae3e0"} Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.833841 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2js5p" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.841890 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.959000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.959054 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.969577 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.975083 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39af282d-8ae3-4a52-9274-cd62d12a1ef1-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-b8rhr\" (UID: \"39af282d-8ae3-4a52-9274-cd62d12a1ef1\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.976592 5118 patch_prober.go:28] interesting 
pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.976667 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.976748 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.978310 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:04:02 crc kubenswrapper[5118]: I0223 07:04:02.978434 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9" gracePeriod=600 Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.158670 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vsnrz" Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.166649 5118 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.205191 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5"] Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.430716 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr"] Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.715319 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" event={"ID":"f4b0cd84-f619-415b-9d18-bc6de1b1f40b","Type":"ContainerStarted","Data":"5e065c0772a6bb982e1c98b541c75512c8993e14125782f6500b2e9f0c45e301"} Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.718263 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9" exitCode=0 Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.718325 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9"} Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.718392 5118 scope.go:117] "RemoveContainer" containerID="52cbafc39cdb826b54cd76a2466f2669d01f6a03956f23813b0d4a4b258ebd73" Feb 23 07:04:03 crc kubenswrapper[5118]: I0223 07:04:03.723282 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" 
event={"ID":"39af282d-8ae3-4a52-9274-cd62d12a1ef1","Type":"ContainerStarted","Data":"41609e14bc940742563c74bedc846426d26fe777c00e18ce7df5d3f95f3bdeab"} Feb 23 07:04:06 crc kubenswrapper[5118]: I0223 07:04:06.537386 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-79wlh" Feb 23 07:04:07 crc kubenswrapper[5118]: I0223 07:04:07.788998 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" event={"ID":"39af282d-8ae3-4a52-9274-cd62d12a1ef1","Type":"ContainerStarted","Data":"24535e61722ae2d1f892aba8fab77c6b0638a888e4be2d717932a29f5365e612"} Feb 23 07:04:07 crc kubenswrapper[5118]: I0223 07:04:07.789745 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:04:07 crc kubenswrapper[5118]: I0223 07:04:07.789800 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:07 crc kubenswrapper[5118]: I0223 07:04:07.792888 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" Feb 23 07:04:07 crc kubenswrapper[5118]: I0223 07:04:07.863912 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" podStartSLOduration=37.863869077 podStartE2EDuration="37.863869077s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:04:07.857566419 +0000 UTC m=+1110.861351022" watchObservedRunningTime="2026-02-23 07:04:07.863869077 +0000 UTC m=+1110.867653660" Feb 23 07:04:07 crc kubenswrapper[5118]: 
I0223 07:04:07.885935 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-mgwl7" podStartSLOduration=9.891776751 podStartE2EDuration="37.885870906s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.110077301 +0000 UTC m=+1075.113861874" lastFinishedPulling="2026-02-23 07:04:00.104171456 +0000 UTC m=+1103.107956029" observedRunningTime="2026-02-23 07:04:07.878134723 +0000 UTC m=+1110.881919316" watchObservedRunningTime="2026-02-23 07:04:07.885870906 +0000 UTC m=+1110.889655529" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.811337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" event={"ID":"07e7dae1-170c-4c44-8ebf-4547edea1c65","Type":"ContainerStarted","Data":"578ed0bf7202a6abfb15628e3d4d29c96728738e9b089f6f8fdd7c43978803e8"} Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.812333 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.813439 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" event={"ID":"61637571-dc55-41bf-9765-415fa7a78100","Type":"ContainerStarted","Data":"ab4742cda778898e574dc8892fe52bb6b2adac49bf44dc247fb7df460ede8f80"} Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.813872 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.815726 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" 
event={"ID":"f4b0cd84-f619-415b-9d18-bc6de1b1f40b","Type":"ContainerStarted","Data":"11974425ab1fc489764159c6aa398cdf3f627ee72ae0552a64e72b575f00e53e"} Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.815854 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.819282 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6"} Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.820890 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" event={"ID":"d42094d4-0a02-4b3d-8bad-7d0a0bda3d47","Type":"ContainerStarted","Data":"7091f19110f0de9e80fa9fad894f356443b7ef6be81183069df9ef26b1530a41"} Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.821266 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.844005 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" podStartSLOduration=2.506305867 podStartE2EDuration="39.843961125s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:32.083568804 +0000 UTC m=+1075.087353377" lastFinishedPulling="2026-02-23 07:04:09.421224062 +0000 UTC m=+1112.425008635" observedRunningTime="2026-02-23 07:04:09.834593915 +0000 UTC m=+1112.838378498" watchObservedRunningTime="2026-02-23 07:04:09.843961125 +0000 UTC m=+1112.847745698" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.885584 
5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" podStartSLOduration=33.615826262 podStartE2EDuration="39.885554555s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:04:03.225558615 +0000 UTC m=+1106.229343228" lastFinishedPulling="2026-02-23 07:04:09.495286948 +0000 UTC m=+1112.499071521" observedRunningTime="2026-02-23 07:04:09.884240914 +0000 UTC m=+1112.888025507" watchObservedRunningTime="2026-02-23 07:04:09.885554555 +0000 UTC m=+1112.889339128" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.909418 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" podStartSLOduration=2.3730618039999998 podStartE2EDuration="39.909392237s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.883111217 +0000 UTC m=+1074.886895790" lastFinishedPulling="2026-02-23 07:04:09.41944164 +0000 UTC m=+1112.423226223" observedRunningTime="2026-02-23 07:04:09.90146443 +0000 UTC m=+1112.905249013" watchObservedRunningTime="2026-02-23 07:04:09.909392237 +0000 UTC m=+1112.913176810" Feb 23 07:04:09 crc kubenswrapper[5118]: I0223 07:04:09.916534 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" podStartSLOduration=2.066788159 podStartE2EDuration="39.916518455s" podCreationTimestamp="2026-02-23 07:03:30 +0000 UTC" firstStartedPulling="2026-02-23 07:03:31.568967597 +0000 UTC m=+1074.572752170" lastFinishedPulling="2026-02-23 07:04:09.418697893 +0000 UTC m=+1112.422482466" observedRunningTime="2026-02-23 07:04:09.914235912 +0000 UTC m=+1112.918020495" watchObservedRunningTime="2026-02-23 07:04:09.916518455 +0000 UTC m=+1112.920303028" Feb 23 07:04:10 crc kubenswrapper[5118]: I0223 07:04:10.888529 
5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-vqkbz" Feb 23 07:04:10 crc kubenswrapper[5118]: I0223 07:04:10.959189 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-4x7fr" Feb 23 07:04:10 crc kubenswrapper[5118]: I0223 07:04:10.971065 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-54gzb" Feb 23 07:04:11 crc kubenswrapper[5118]: I0223 07:04:11.043663 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-gvdcc" Feb 23 07:04:11 crc kubenswrapper[5118]: I0223 07:04:11.352059 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-xkf9z" Feb 23 07:04:13 crc kubenswrapper[5118]: I0223 07:04:13.178091 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-b8rhr" Feb 23 07:04:20 crc kubenswrapper[5118]: I0223 07:04:20.657292 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" Feb 23 07:04:20 crc kubenswrapper[5118]: I0223 07:04:20.939614 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-qhvx6" Feb 23 07:04:21 crc kubenswrapper[5118]: I0223 07:04:21.277813 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-4zd6f" Feb 23 07:04:22 crc kubenswrapper[5118]: I0223 07:04:22.853184 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.301827 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.308369 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.312020 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rmvnv" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.312683 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.312809 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.312817 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.343292 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.423649 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clt8r\" (UniqueName: \"kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.423714 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: 
\"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.509001 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.510181 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.515595 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.524714 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.525050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clt8r\" (UniqueName: \"kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.525266 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.526981 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.574201 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-clt8r\" (UniqueName: \"kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r\") pod \"dnsmasq-dns-855cbc58c5-srf62\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.627115 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.627225 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgkt\" (UniqueName: \"kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.627265 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.628542 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.728861 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.728997 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgkt\" (UniqueName: \"kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.729058 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.730821 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.731673 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 
07:04:38.761159 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgkt\" (UniqueName: \"kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt\") pod \"dnsmasq-dns-6fcf94d689-vwjd7\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.837945 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:04:38 crc kubenswrapper[5118]: I0223 07:04:38.923503 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:04:38 crc kubenswrapper[5118]: W0223 07:04:38.939662 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod132705b8_36e1_4bbf_9068_5c232ac19be1.slice/crio-9d308ffadb8a0032079d69977af833660562f978c5c1aa228fdf4d9a2530b2a9 WatchSource:0}: Error finding container 9d308ffadb8a0032079d69977af833660562f978c5c1aa228fdf4d9a2530b2a9: Status 404 returned error can't find the container with id 9d308ffadb8a0032079d69977af833660562f978c5c1aa228fdf4d9a2530b2a9 Feb 23 07:04:39 crc kubenswrapper[5118]: I0223 07:04:39.104191 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" event={"ID":"132705b8-36e1-4bbf-9068-5c232ac19be1","Type":"ContainerStarted","Data":"9d308ffadb8a0032079d69977af833660562f978c5c1aa228fdf4d9a2530b2a9"} Feb 23 07:04:39 crc kubenswrapper[5118]: I0223 07:04:39.324997 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:04:39 crc kubenswrapper[5118]: W0223 07:04:39.326503 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21f2c305_f1bb_416e_99da_9bf705a38e9f.slice/crio-ac937059dac6b47d7d0faca2b9142e16f02a377e2d88c797a19b40f8523c2b2e WatchSource:0}: Error finding container ac937059dac6b47d7d0faca2b9142e16f02a377e2d88c797a19b40f8523c2b2e: Status 404 returned error can't find the container with id ac937059dac6b47d7d0faca2b9142e16f02a377e2d88c797a19b40f8523c2b2e Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.122850 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" event={"ID":"21f2c305-f1bb-416e-99da-9bf705a38e9f","Type":"ContainerStarted","Data":"ac937059dac6b47d7d0faca2b9142e16f02a377e2d88c797a19b40f8523c2b2e"} Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.565575 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.584464 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"] Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.585584 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.609135 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"] Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.669778 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rm8\" (UniqueName: \"kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.669860 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.669900 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.773346 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rm8\" (UniqueName: \"kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.773417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.773471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.774590 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.777529 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.818263 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rm8\" (UniqueName: \"kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8\") pod \"dnsmasq-dns-f54874ffc-gbff8\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") " pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:40 crc kubenswrapper[5118]: I0223 07:04:40.909801 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.094152 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.171809 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"] Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.174968 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.196626 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"] Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.292182 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npfhw\" (UniqueName: \"kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.292224 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.292281 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 
07:04:41.393650 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.393779 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npfhw\" (UniqueName: \"kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.393801 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.394913 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.395388 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.414809 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npfhw\" 
(UniqueName: \"kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw\") pod \"dnsmasq-dns-67ff45466c-67mqx\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.503260 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.553652 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"] Feb 23 07:04:41 crc kubenswrapper[5118]: W0223 07:04:41.564078 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97208c3d_4be8_473e_8949_b3c9a5f274f9.slice/crio-18a72f4bf3dcc4fb358921fe625a93b13f1430790cfdd0a4e05e60d651a4c6b0 WatchSource:0}: Error finding container 18a72f4bf3dcc4fb358921fe625a93b13f1430790cfdd0a4e05e60d651a4c6b0: Status 404 returned error can't find the container with id 18a72f4bf3dcc4fb358921fe625a93b13f1430790cfdd0a4e05e60d651a4c6b0 Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.726067 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.727819 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.734540 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.735072 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.735592 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.735748 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.735907 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9sf89" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.736180 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.736357 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.741552 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800146 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800328 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800405 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800456 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800485 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800551 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800669 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800694 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800718 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.800774 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2d7\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902372 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " 
pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902416 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902439 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902459 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902491 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902544 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902573 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902590 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902609 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902631 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2d7\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.902653 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.903455 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.903931 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.908025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.908249 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.912415 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.913583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.913815 
5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.914823 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.916794 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.928163 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2d7\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.933911 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.941309 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: 
\"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " pod="openstack/rabbitmq-server-0" Feb 23 07:04:41 crc kubenswrapper[5118]: I0223 07:04:41.984222 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"] Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.064603 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.179876 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" event={"ID":"ccba9518-711f-4a31-aff7-1817619a7a30","Type":"ContainerStarted","Data":"55d16a28b5b004d82e740e5f07b4ee491b362004faf2eaa40e41785526341bf5"} Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.181237 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" event={"ID":"97208c3d-4be8-473e-8949-b3c9a5f274f9","Type":"ContainerStarted","Data":"18a72f4bf3dcc4fb358921fe625a93b13f1430790cfdd0a4e05e60d651a4c6b0"} Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.269475 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.271768 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.277311 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.277583 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.277844 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-87v4t" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.278976 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.281820 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.281823 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.289407 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.291688 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321711 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321762 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321804 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321825 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321856 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321874 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh82w\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321891 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321946 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.321987 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.322012 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.322039 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.419271 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:04:42 crc 
kubenswrapper[5118]: I0223 07:04:42.424039 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424174 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424213 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424262 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424311 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424333 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh82w\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424340 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425279 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425343 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.424349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425652 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425709 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.425950 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.426361 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.433781 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.434471 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.434738 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.434801 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.442932 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh82w\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.455253 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:42 crc kubenswrapper[5118]: I0223 07:04:42.632009 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.192251 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerStarted","Data":"de6bfc3dbe9249393b3ded714f1438bc7da2bcefbb345ebbff0a39d67d7c978d"} Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.394642 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.964832 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.976141 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.985282 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bnkg4" Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.986450 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 23 07:04:43 crc kubenswrapper[5118]: I0223 07:04:43.991919 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.023953 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.024632 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.041899 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077153 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077240 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077275 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077302 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077317 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzk42\" (UniqueName: \"kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077340 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.077391 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179183 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179247 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179268 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzk42\" (UniqueName: \"kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179292 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179318 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated\") pod \"openstack-galera-0\" 
(UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179384 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.179441 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.180963 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.181129 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.181652 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.182017 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.183110 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.192183 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.192191 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.200688 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzk42\" (UniqueName: 
\"kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.226815 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " pod="openstack/openstack-galera-0" Feb 23 07:04:44 crc kubenswrapper[5118]: I0223 07:04:44.337079 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.088911 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.090240 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.096379 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g67h5" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.096525 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.096806 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.096999 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.114863 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204247 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204381 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k2sk\" (UniqueName: \"kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204446 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204540 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc 
kubenswrapper[5118]: I0223 07:04:45.204560 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204691 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.204759 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306090 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306191 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 
07:04:45.306247 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306281 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306356 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306382 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306411 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6k2sk\" (UniqueName: \"kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.306694 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.309652 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.309767 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.309931 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.310264 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.313678 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.318075 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.327381 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k2sk\" (UniqueName: \"kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.345369 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") " pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.417716 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.494182 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.499061 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.504626 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-747zt" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.506924 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.507212 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.511581 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.511624 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.511667 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8wl8\" (UniqueName: \"kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " 
pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.511687 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.511717 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.516405 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.614478 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.614592 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8wl8\" (UniqueName: \"kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.614630 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " 
pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.614675 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.614790 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.615704 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.615925 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.618425 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.626055 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.636589 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8wl8\" (UniqueName: \"kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8\") pod \"memcached-0\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " pod="openstack/memcached-0" Feb 23 07:04:45 crc kubenswrapper[5118]: I0223 07:04:45.832034 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.619848 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.620955 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.624249 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5vbjq" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.640127 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.660549 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7t5\" (UniqueName: \"kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5\") pod \"kube-state-metrics-0\" (UID: \"6249b699-2754-43b0-ae07-e726d80c5233\") " pod="openstack/kube-state-metrics-0" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.762078 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7t5\" (UniqueName: 
\"kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5\") pod \"kube-state-metrics-0\" (UID: \"6249b699-2754-43b0-ae07-e726d80c5233\") " pod="openstack/kube-state-metrics-0" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.790078 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7t5\" (UniqueName: \"kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5\") pod \"kube-state-metrics-0\" (UID: \"6249b699-2754-43b0-ae07-e726d80c5233\") " pod="openstack/kube-state-metrics-0" Feb 23 07:04:47 crc kubenswrapper[5118]: I0223 07:04:47.946790 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:04:49 crc kubenswrapper[5118]: I0223 07:04:49.247400 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerStarted","Data":"98d02a05363a9f33a34e46dba064e4d303fb9272026a6b4a3347a78c263c4417"} Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.982069 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.984982 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.990748 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.990979 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.991001 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.991575 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7f2rl" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.995333 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 23 07:04:50 crc kubenswrapper[5118]: I0223 07:04:50.997767 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.116715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.116794 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.116828 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.116864 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.116886 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.117031 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqm47\" (UniqueName: \"kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.117272 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.117434 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.204371 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f8rdg"] Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.210152 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.217664 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218212 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-gsr44" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218381 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218642 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218711 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218763 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218798 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218828 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqm47\" (UniqueName: \"kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218851 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218895 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.218947 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.219312 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.225468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.226782 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.227991 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.228286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.228324 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 
crc kubenswrapper[5118]: I0223 07:04:51.231264 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.232379 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8rdg"] Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.232537 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.246617 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqm47\" (UniqueName: \"kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.269977 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.280569 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") " pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.296878 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.318823 5118 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320141 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rst7g\" (UniqueName: \"kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320211 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320237 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320271 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320298 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs\") pod \"ovn-controller-f8rdg\" (UID: 
\"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320341 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.320360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422351 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rst7g\" (UniqueName: \"kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422409 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxnx\" (UniqueName: \"kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422435 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log\") pod \"ovn-controller-ovs-dvc2v\" (UID: 
\"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422458 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422613 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422668 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422764 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.422792 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc 
kubenswrapper[5118]: I0223 07:04:51.422994 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423199 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423350 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423376 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423427 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423553 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.423692 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.426053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.426305 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.441270 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rst7g\" (UniqueName: \"kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g\") pod \"ovn-controller-f8rdg\" (UID: 
\"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.441470 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle\") pod \"ovn-controller-f8rdg\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525521 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525596 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525642 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525672 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxnx\" (UniqueName: \"kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525702 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525840 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525846 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.525940 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.526058 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.529266 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.550622 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxnx\" (UniqueName: \"kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx\") pod \"ovn-controller-ovs-dvc2v\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") " pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.641986 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg" Feb 23 07:04:51 crc kubenswrapper[5118]: I0223 07:04:51.655941 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.266704 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.269305 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.272361 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.272477 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.272714 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.273382 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-vrjwc" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.274236 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.396809 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.396890 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcvv\" (UniqueName: \"kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.396925 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.397002 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.397131 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.397294 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.397378 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.397451 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " 
pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500429 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500534 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500561 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500612 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500648 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500681 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500725 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.500775 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcvv\" (UniqueName: \"kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.503789 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.510707 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.517291 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.517676 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.525156 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.525309 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.525360 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.530471 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.548405 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9dcvv\" (UniqueName: \"kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv\") pod \"ovsdbserver-sb-0\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " pod="openstack/ovsdbserver-sb-0" Feb 23 07:04:55 crc kubenswrapper[5118]: I0223 07:04:55.599472 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.311317 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.312047 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pgkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-vwjd7_openstack(21f2c305-f1bb-416e-99da-9bf705a38e9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.314601 5118 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" podUID="21f2c305-f1bb-416e-99da-9bf705a38e9f" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.328491 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.328748 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6rm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-gbff8_openstack(97208c3d-4be8-473e-8949-b3c9a5f274f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.330179 5118 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.337807 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.337982 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-npfhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-67mqx_openstack(ccba9518-711f-4a31-aff7-1817619a7a30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.339303 5118 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.355024 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.355328 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-clt8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-srf62_openstack(132705b8-36e1-4bbf-9068-5c232ac19be1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.357380 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-srf62" podUID="132705b8-36e1-4bbf-9068-5c232ac19be1" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.465235 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" Feb 23 07:05:03 crc kubenswrapper[5118]: E0223 07:05:03.484818 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" Feb 23 07:05:03 crc kubenswrapper[5118]: I0223 07:05:03.805744 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:05:03 crc kubenswrapper[5118]: I0223 07:05:03.954000 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.106433 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc\") pod \"21f2c305-f1bb-416e-99da-9bf705a38e9f\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.106700 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pgkt\" (UniqueName: \"kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt\") pod \"21f2c305-f1bb-416e-99da-9bf705a38e9f\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.106744 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config\") pod \"21f2c305-f1bb-416e-99da-9bf705a38e9f\" (UID: \"21f2c305-f1bb-416e-99da-9bf705a38e9f\") " Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.106888 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21f2c305-f1bb-416e-99da-9bf705a38e9f" (UID: "21f2c305-f1bb-416e-99da-9bf705a38e9f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.107157 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.107467 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config" (OuterVolumeSpecName: "config") pod "21f2c305-f1bb-416e-99da-9bf705a38e9f" (UID: "21f2c305-f1bb-416e-99da-9bf705a38e9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.114682 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt" (OuterVolumeSpecName: "kube-api-access-7pgkt") pod "21f2c305-f1bb-416e-99da-9bf705a38e9f" (UID: "21f2c305-f1bb-416e-99da-9bf705a38e9f"). InnerVolumeSpecName "kube-api-access-7pgkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.142844 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8rdg"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.158315 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.209374 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pgkt\" (UniqueName: \"kubernetes.io/projected/21f2c305-f1bb-416e-99da-9bf705a38e9f-kube-api-access-7pgkt\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.209413 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21f2c305-f1bb-416e-99da-9bf705a38e9f-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.310727 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clt8r\" (UniqueName: \"kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r\") pod \"132705b8-36e1-4bbf-9068-5c232ac19be1\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.310873 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config\") pod \"132705b8-36e1-4bbf-9068-5c232ac19be1\" (UID: \"132705b8-36e1-4bbf-9068-5c232ac19be1\") " Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.311934 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config" (OuterVolumeSpecName: "config") pod "132705b8-36e1-4bbf-9068-5c232ac19be1" (UID: "132705b8-36e1-4bbf-9068-5c232ac19be1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.329155 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.381701 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.391845 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.413805 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/132705b8-36e1-4bbf-9068-5c232ac19be1-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.417385 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r" (OuterVolumeSpecName: "kube-api-access-clt8r") pod "132705b8-36e1-4bbf-9068-5c232ac19be1" (UID: "132705b8-36e1-4bbf-9068-5c232ac19be1"). InnerVolumeSpecName "kube-api-access-clt8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.439342 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.493136 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6249b699-2754-43b0-ae07-e726d80c5233","Type":"ContainerStarted","Data":"5a019e560972794adcea298997061d3ea7c2e95996d214e59f3ad5f5853f8f88"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.494479 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg" event={"ID":"c6bf46e9-d93e-4754-9f48-fc598c9e1359","Type":"ContainerStarted","Data":"51baf97df9c9fc1dde9276e6727d223947f6b7974e9089ff60b43dab0132becf"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.495973 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerStarted","Data":"ad90bffdaded534d0ab312196f1748105833f72f25de5f940c93acd8dca99b4c"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.499573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerStarted","Data":"9301e9f414bcb2a4c27bd7223faf1d0cca6f8bed94f16cee9612a8fec1e089b3"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.504626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" event={"ID":"21f2c305-f1bb-416e-99da-9bf705a38e9f","Type":"ContainerDied","Data":"ac937059dac6b47d7d0faca2b9142e16f02a377e2d88c797a19b40f8523c2b2e"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.504765 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-vwjd7" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.505882 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerStarted","Data":"11feabea80b251ccced75ac02e3dc37398e6a4b3a896a11729e74ab950ac1672"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.508356 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" event={"ID":"132705b8-36e1-4bbf-9068-5c232ac19be1","Type":"ContainerDied","Data":"9d308ffadb8a0032079d69977af833660562f978c5c1aa228fdf4d9a2530b2a9"} Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.508448 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-srf62" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.516715 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.516838 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clt8r\" (UniqueName: \"kubernetes.io/projected/132705b8-36e1-4bbf-9068-5c232ac19be1-kube-api-access-clt8r\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.645533 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.653206 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-vwjd7"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.673716 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:05:04 crc kubenswrapper[5118]: I0223 07:05:04.686938 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:05:04 crc kubenswrapper[5118]: 
I0223 07:05:04.692677 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-srf62"] Feb 23 07:05:04 crc kubenswrapper[5118]: W0223 07:05:04.752398 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod015de465_4df3_4178_b28b_3dd5ec0f37aa.slice/crio-aebfe7f2b2170606fd6785625c51c21b5631b8b51e702b5e19689be0347c590b WatchSource:0}: Error finding container aebfe7f2b2170606fd6785625c51c21b5631b8b51e702b5e19689be0347c590b: Status 404 returned error can't find the container with id aebfe7f2b2170606fd6785625c51c21b5631b8b51e702b5e19689be0347c590b Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.520985 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerStarted","Data":"cdb183b32df15cc29470c163dcec76e49a1bcccca2adcf52c5951d4b1ff228f2"} Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.526613 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerStarted","Data":"aebfe7f2b2170606fd6785625c51c21b5631b8b51e702b5e19689be0347c590b"} Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.528524 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52ab08e4-a114-4c99-adc6-dc05f711d8d9","Type":"ContainerStarted","Data":"2bd32646f8f8eec7c59c41f532ac0fc44795487171516ec5a5dcbbf77f899b7f"} Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.532766 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerStarted","Data":"765ded8ac4c466b73ff678e59e3c1bd2c7e26f27d5b181a36f4fa8da845c96bd"} Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.537574 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerStarted","Data":"beaf5e0519e34249b5662fadd22cd742ef8a03b6ba0e170d7b95319af9915030"} Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.721618 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132705b8-36e1-4bbf-9068-5c232ac19be1" path="/var/lib/kubelet/pods/132705b8-36e1-4bbf-9068-5c232ac19be1/volumes" Feb 23 07:05:05 crc kubenswrapper[5118]: I0223 07:05:05.722475 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f2c305-f1bb-416e-99da-9bf705a38e9f" path="/var/lib/kubelet/pods/21f2c305-f1bb-416e-99da-9bf705a38e9f/volumes" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.628461 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52ab08e4-a114-4c99-adc6-dc05f711d8d9","Type":"ContainerStarted","Data":"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.629913 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.638312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerStarted","Data":"566012d8143e61ef90b99119f834d9897363370f946eb6cf21393af728dda6a1"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.642629 5118 generic.go:334] "Generic (PLEG): container finished" podID="68f11050-5931-4be3-8e5b-194035e88020" containerID="8fadd3a879ca766fdf860608b51130b44f68888641fff695fc8c3f558ac8fed4" exitCode=0 Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.642673 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" 
event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerDied","Data":"8fadd3a879ca766fdf860608b51130b44f68888641fff695fc8c3f558ac8fed4"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.644458 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerStarted","Data":"ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.647145 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerStarted","Data":"fe75c42f85412ce15214f8ef6bc41dd0ed931d945a2697c44e4ce3aee60b1692"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.649372 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6249b699-2754-43b0-ae07-e726d80c5233","Type":"ContainerStarted","Data":"f1fb890b79737bfbf99ce5ecc106b736e0487142f1609990b3b71bfce17058cd"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.649764 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.657460 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.064811863 podStartE2EDuration="28.657442436s" podCreationTimestamp="2026-02-23 07:04:45 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.58396981 +0000 UTC m=+1167.587754383" lastFinishedPulling="2026-02-23 07:05:12.176600383 +0000 UTC m=+1175.180384956" observedRunningTime="2026-02-23 07:05:13.649413397 +0000 UTC m=+1176.653197970" watchObservedRunningTime="2026-02-23 07:05:13.657442436 +0000 UTC m=+1176.661227009" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.670917 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerStarted","Data":"49b70411a72db2ea12793c02e06b682bbf363fae840fa8bc8f2d66bc890a6eef"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.674640 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg" event={"ID":"c6bf46e9-d93e-4754-9f48-fc598c9e1359","Type":"ContainerStarted","Data":"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd"} Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.674910 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-f8rdg" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.776777 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-f8rdg" podStartSLOduration=14.718980901 podStartE2EDuration="22.776751648s" podCreationTimestamp="2026-02-23 07:04:51 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.142527435 +0000 UTC m=+1167.146311998" lastFinishedPulling="2026-02-23 07:05:12.200298162 +0000 UTC m=+1175.204082745" observedRunningTime="2026-02-23 07:05:13.763700391 +0000 UTC m=+1176.767484964" watchObservedRunningTime="2026-02-23 07:05:13.776751648 +0000 UTC m=+1176.780536211" Feb 23 07:05:13 crc kubenswrapper[5118]: I0223 07:05:13.785430 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.448896165 podStartE2EDuration="26.785409642s" podCreationTimestamp="2026-02-23 07:04:47 +0000 UTC" firstStartedPulling="2026-02-23 07:05:03.835208622 +0000 UTC m=+1166.838993195" lastFinishedPulling="2026-02-23 07:05:12.171722099 +0000 UTC m=+1175.175506672" observedRunningTime="2026-02-23 07:05:13.777428195 +0000 UTC m=+1176.781212778" watchObservedRunningTime="2026-02-23 07:05:13.785409642 +0000 UTC m=+1176.789194215" Feb 23 07:05:14 crc kubenswrapper[5118]: I0223 07:05:14.689784 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerStarted","Data":"b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5"} Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.710417 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerStarted","Data":"2ddc356225a64a2567d2c939ebf5232096d7165e8a350bbdddf0c8812106b5e2"} Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.711550 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerStarted","Data":"4bcc61517bff899445a02cd4853722bfdc2eee81bfb4ecb4fabf4e9cdeba87db"} Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.713087 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerStarted","Data":"b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d"} Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.713360 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.715015 5118 generic.go:334] "Generic (PLEG): container finished" podID="ccba9518-711f-4a31-aff7-1817619a7a30" containerID="4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b" exitCode=0 Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.715220 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" event={"ID":"ccba9518-711f-4a31-aff7-1817619a7a30","Type":"ContainerDied","Data":"4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b"} Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.740389 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.606558762 podStartE2EDuration="26.74036215s" podCreationTimestamp="2026-02-23 07:04:49 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.756082926 +0000 UTC m=+1167.759867499" lastFinishedPulling="2026-02-23 07:05:14.889886314 +0000 UTC m=+1177.893670887" observedRunningTime="2026-02-23 07:05:15.729321119 +0000 UTC m=+1178.733105712" watchObservedRunningTime="2026-02-23 07:05:15.74036215 +0000 UTC m=+1178.744146733"
Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.764081 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.437124639 podStartE2EDuration="21.764057478s" podCreationTimestamp="2026-02-23 07:04:54 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.575607183 +0000 UTC m=+1167.579391746" lastFinishedPulling="2026-02-23 07:05:14.902540022 +0000 UTC m=+1177.906324585" observedRunningTime="2026-02-23 07:05:15.755039655 +0000 UTC m=+1178.758824268" watchObservedRunningTime="2026-02-23 07:05:15.764057478 +0000 UTC m=+1178.767842061"
Feb 23 07:05:15 crc kubenswrapper[5118]: I0223 07:05:15.816356 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dvc2v" podStartSLOduration=16.9705564 podStartE2EDuration="24.81633429s" podCreationTimestamp="2026-02-23 07:04:51 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.347507497 +0000 UTC m=+1167.351292090" lastFinishedPulling="2026-02-23 07:05:12.193285417 +0000 UTC m=+1175.197069980" observedRunningTime="2026-02-23 07:05:15.811525166 +0000 UTC m=+1178.815309779" watchObservedRunningTime="2026-02-23 07:05:15.81633429 +0000 UTC m=+1178.820118863"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.319209 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.600604 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.656232 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dvc2v"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.730682 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" event={"ID":"ccba9518-711f-4a31-aff7-1817619a7a30","Type":"ContainerStarted","Data":"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb"}
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.731033 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67ff45466c-67mqx"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.736798 5118 generic.go:334] "Generic (PLEG): container finished" podID="d08c9f04-59d9-4892-a540-5c892c604a71" containerID="566012d8143e61ef90b99119f834d9897363370f946eb6cf21393af728dda6a1" exitCode=0
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.736964 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerDied","Data":"566012d8143e61ef90b99119f834d9897363370f946eb6cf21393af728dda6a1"}
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.741585 5118 generic.go:334] "Generic (PLEG): container finished" podID="178ef478-d8d3-49a5-9188-9970d3859049" containerID="fe75c42f85412ce15214f8ef6bc41dd0ed931d945a2697c44e4ce3aee60b1692" exitCode=0
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.741717 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerDied","Data":"fe75c42f85412ce15214f8ef6bc41dd0ed931d945a2697c44e4ce3aee60b1692"}
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.805910 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" podStartSLOduration=2.464127059 podStartE2EDuration="35.805891203s" podCreationTimestamp="2026-02-23 07:04:41 +0000 UTC" firstStartedPulling="2026-02-23 07:04:41.993006956 +0000 UTC m=+1144.996791529" lastFinishedPulling="2026-02-23 07:05:15.3347711 +0000 UTC m=+1178.338555673" observedRunningTime="2026-02-23 07:05:16.767521049 +0000 UTC m=+1179.771305632" watchObservedRunningTime="2026-02-23 07:05:16.805891203 +0000 UTC m=+1179.809675776"
Feb 23 07:05:16 crc kubenswrapper[5118]: I0223 07:05:16.839128 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.756661 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerStarted","Data":"0123533d83aacb1fef7bf221a008c5a2e879c0ab52103eda69db30b401c15faa"}
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.761688 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerStarted","Data":"e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818"}
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.763019 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.812960 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.977630515 podStartE2EDuration="35.812920638s" podCreationTimestamp="2026-02-23 07:04:42 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.386249829 +0000 UTC m=+1167.390034412" lastFinishedPulling="2026-02-23 07:05:12.221539952 +0000 UTC m=+1175.225324535" observedRunningTime="2026-02-23 07:05:17.792604779 +0000 UTC m=+1180.796389372" watchObservedRunningTime="2026-02-23 07:05:17.812920638 +0000 UTC m=+1180.816705251"
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.842792 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.884062 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.07164876 podStartE2EDuration="33.883665725s" podCreationTimestamp="2026-02-23 07:04:44 +0000 UTC" firstStartedPulling="2026-02-23 07:05:04.387315485 +0000 UTC m=+1167.391100058" lastFinishedPulling="2026-02-23 07:05:12.19933245 +0000 UTC m=+1175.203117023" observedRunningTime="2026-02-23 07:05:17.829318825 +0000 UTC m=+1180.833103408" watchObservedRunningTime="2026-02-23 07:05:17.883665725 +0000 UTC m=+1180.887450338"
Feb 23 07:05:17 crc kubenswrapper[5118]: I0223 07:05:17.957866 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.104137 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.172167 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.173862 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.179625 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.193201 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.194694 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.206606 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.211056 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.221606 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.311531 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.311577 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.311703 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbs6f\" (UniqueName: \"kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.311869 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.311958 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxghk\" (UniqueName: \"kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.312061 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.312146 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.312243 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.312270 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.312360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.319016 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.358345 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414248 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414309 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414359 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414381 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414421 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414450 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414472 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414509 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6f\" (UniqueName: \"kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414553 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.414586 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxghk\" (UniqueName: \"kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.415424 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.415600 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.415643 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.415717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.415740 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.416291 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.419891 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.425762 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.433547 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxghk\" (UniqueName: \"kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk\") pod \"ovn-controller-metrics-jtrcj\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.437975 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbs6f\" (UniqueName: \"kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f\") pod \"dnsmasq-dns-5b9c664f7-vs8bw\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.508466 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.528174 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.530221 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jtrcj"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.559020 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.567025 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.569579 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.577980 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.731034 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.731419 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.731695 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.732039 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkz8j\" (UniqueName: \"kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.732302 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.786600 5118 generic.go:334] "Generic (PLEG): container finished" podID="97208c3d-4be8-473e-8949-b3c9a5f274f9" containerID="2092e0bc48f2327e93623b811c563a5a286c9c02a3be51a0b705e4cef0fced1c" exitCode=0
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.786887 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" event={"ID":"97208c3d-4be8-473e-8949-b3c9a5f274f9","Type":"ContainerDied","Data":"2092e0bc48f2327e93623b811c563a5a286c9c02a3be51a0b705e4cef0fced1c"}
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.787520 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="dnsmasq-dns" containerID="cri-o://263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb" gracePeriod=10
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.839458 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.839531 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.839646 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.839773 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkz8j\" (UniqueName: \"kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.839891 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.842065 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.854214 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.855408 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.855430 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.866808 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.869551 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkz8j\" (UniqueName: \"kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j\") pod \"dnsmasq-dns-75b7bcc64f-jnx59\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") " pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.905633 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"]
Feb 23 07:05:18 crc kubenswrapper[5118]: I0223 07:05:18.944355 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.035108 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.079720 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.079889 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.100673 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.101607 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8rxc8"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.101790 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.101935 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.178602 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"]
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.200619 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-gbff8"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259718 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259779 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259818 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259872 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259908 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.259936 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lv4x\" (UniqueName: \"kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.330075 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-67mqx"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361048 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config\") pod \"97208c3d-4be8-473e-8949-b3c9a5f274f9\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") "
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361428 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6rm8\" (UniqueName: \"kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8\") pod \"97208c3d-4be8-473e-8949-b3c9a5f274f9\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") "
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361529 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc\") pod \"97208c3d-4be8-473e-8949-b3c9a5f274f9\" (UID: \"97208c3d-4be8-473e-8949-b3c9a5f274f9\") "
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361828 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361879 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361938 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361972 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.361994 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lv4x\" (UniqueName: \"kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.362109 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.362135 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.368383 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.368505 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.368657 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0"
Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.393090 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8" (OuterVolumeSpecName: "kube-api-access-x6rm8") pod "97208c3d-4be8-473e-8949-b3c9a5f274f9" (UID: "97208c3d-4be8-473e-8949-b3c9a5f274f9"). InnerVolumeSpecName "kube-api-access-x6rm8".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.393791 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.404263 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.419760 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.430998 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lv4x\" (UniqueName: \"kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x\") pod \"ovn-northd-0\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " pod="openstack/ovn-northd-0" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.464793 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc\") pod \"ccba9518-711f-4a31-aff7-1817619a7a30\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.465058 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npfhw\" (UniqueName: 
\"kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw\") pod \"ccba9518-711f-4a31-aff7-1817619a7a30\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.465711 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config\") pod \"ccba9518-711f-4a31-aff7-1817619a7a30\" (UID: \"ccba9518-711f-4a31-aff7-1817619a7a30\") " Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.466178 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6rm8\" (UniqueName: \"kubernetes.io/projected/97208c3d-4be8-473e-8949-b3c9a5f274f9-kube-api-access-x6rm8\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.467825 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config" (OuterVolumeSpecName: "config") pod "97208c3d-4be8-473e-8949-b3c9a5f274f9" (UID: "97208c3d-4be8-473e-8949-b3c9a5f274f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.480282 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw" (OuterVolumeSpecName: "kube-api-access-npfhw") pod "ccba9518-711f-4a31-aff7-1817619a7a30" (UID: "ccba9518-711f-4a31-aff7-1817619a7a30"). InnerVolumeSpecName "kube-api-access-npfhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.521271 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97208c3d-4be8-473e-8949-b3c9a5f274f9" (UID: "97208c3d-4be8-473e-8949-b3c9a5f274f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.546769 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccba9518-711f-4a31-aff7-1817619a7a30" (UID: "ccba9518-711f-4a31-aff7-1817619a7a30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.571413 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.571443 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.571455 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npfhw\" (UniqueName: \"kubernetes.io/projected/ccba9518-711f-4a31-aff7-1817619a7a30-kube-api-access-npfhw\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.571465 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97208c3d-4be8-473e-8949-b3c9a5f274f9-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 
07:05:19.586968 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config" (OuterVolumeSpecName: "config") pod "ccba9518-711f-4a31-aff7-1817619a7a30" (UID: "ccba9518-711f-4a31-aff7-1817619a7a30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.654020 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"] Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.672673 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccba9518-711f-4a31-aff7-1817619a7a30-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.716275 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.808738 5118 generic.go:334] "Generic (PLEG): container finished" podID="ccba9518-711f-4a31-aff7-1817619a7a30" containerID="263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb" exitCode=0 Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.808870 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.809597 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" event={"ID":"ccba9518-711f-4a31-aff7-1817619a7a30","Type":"ContainerDied","Data":"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.809655 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-67mqx" event={"ID":"ccba9518-711f-4a31-aff7-1817619a7a30","Type":"ContainerDied","Data":"55d16a28b5b004d82e740e5f07b4ee491b362004faf2eaa40e41785526341bf5"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.809676 5118 scope.go:117] "RemoveContainer" containerID="263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.813592 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" event={"ID":"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10","Type":"ContainerStarted","Data":"1a015464589a0b86a2ef4880bf31ab7687571218eae14289cdd72e8cc56afcbe"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.817850 5118 generic.go:334] "Generic (PLEG): container finished" podID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerID="8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c" exitCode=0 Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.817928 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" event={"ID":"b619f375-a02c-434f-a7fb-fc15a9c19bc3","Type":"ContainerDied","Data":"8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.817982 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" 
event={"ID":"b619f375-a02c-434f-a7fb-fc15a9c19bc3","Type":"ContainerStarted","Data":"0f6960f1e33f01366d307dc92be1049665f10c0cec1f90951e919c832e947ff3"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.823494 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.823518 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-gbff8" event={"ID":"97208c3d-4be8-473e-8949-b3c9a5f274f9","Type":"ContainerDied","Data":"18a72f4bf3dcc4fb358921fe625a93b13f1430790cfdd0a4e05e60d651a4c6b0"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.840639 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jtrcj" event={"ID":"9568fac1-abdb-4b34-a0b7-e27d6c2183ee","Type":"ContainerStarted","Data":"e782f3020fb567d1d112542543ef24f19219d86c018400d9252caaab0cecce88"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.840674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jtrcj" event={"ID":"9568fac1-abdb-4b34-a0b7-e27d6c2183ee","Type":"ContainerStarted","Data":"7d77ea7b34a758bfcf2117aa91c7d2693cd803d6697653b510f7487074e9e561"} Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.862369 5118 scope.go:117] "RemoveContainer" containerID="4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.879350 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jtrcj" podStartSLOduration=1.879307542 podStartE2EDuration="1.879307542s" podCreationTimestamp="2026-02-23 07:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:19.869314727 +0000 UTC m=+1182.873099300" watchObservedRunningTime="2026-02-23 
07:05:19.879307542 +0000 UTC m=+1182.883092105" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.896034 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"] Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.901015 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-67mqx"] Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.909482 5118 scope.go:117] "RemoveContainer" containerID="263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb" Feb 23 07:05:19 crc kubenswrapper[5118]: E0223 07:05:19.931221 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb\": container with ID starting with 263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb not found: ID does not exist" containerID="263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.931282 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb"} err="failed to get container status \"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb\": rpc error: code = NotFound desc = could not find container \"263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb\": container with ID starting with 263a8efe6fd765ce98793f1920f836df7f1fcecca2837d516df916e6e4ad91cb not found: ID does not exist" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.931325 5118 scope.go:117] "RemoveContainer" containerID="4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b" Feb 23 07:05:19 crc kubenswrapper[5118]: E0223 07:05:19.936248 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b\": container with ID starting with 4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b not found: ID does not exist" containerID="4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.936325 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b"} err="failed to get container status \"4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b\": rpc error: code = NotFound desc = could not find container \"4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b\": container with ID starting with 4b98a6a939703dbf21849230ef65853936efdc8badfccd9dd568f8c439f9fb5b not found: ID does not exist" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.936357 5118 scope.go:117] "RemoveContainer" containerID="2092e0bc48f2327e93623b811c563a5a286c9c02a3be51a0b705e4cef0fced1c" Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.958450 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"] Feb 23 07:05:19 crc kubenswrapper[5118]: I0223 07:05:19.979384 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-gbff8"] Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.240907 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:05:20 crc kubenswrapper[5118]: W0223 07:05:20.244004 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fbf0cfa_8a35_49c6_bfa5_6639a1e75752.slice/crio-e581b26bea61e2ab21c91d0bfc1c92a522875f21a634e723c36b7d81a4b8837a WatchSource:0}: Error finding container e581b26bea61e2ab21c91d0bfc1c92a522875f21a634e723c36b7d81a4b8837a: Status 404 returned error can't find the container with id 
e581b26bea61e2ab21c91d0bfc1c92a522875f21a634e723c36b7d81a4b8837a Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.832976 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.860033 5118 generic.go:334] "Generic (PLEG): container finished" podID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerID="5e73fa4580f4c73c9c2474d6d8e409a299fd6e9d2a0cb10f8a794c016381abeb" exitCode=0 Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.860121 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" event={"ID":"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10","Type":"ContainerDied","Data":"5e73fa4580f4c73c9c2474d6d8e409a299fd6e9d2a0cb10f8a794c016381abeb"} Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.863135 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" event={"ID":"b619f375-a02c-434f-a7fb-fc15a9c19bc3","Type":"ContainerStarted","Data":"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c"} Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.863301 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.864469 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerStarted","Data":"e581b26bea61e2ab21c91d0bfc1c92a522875f21a634e723c36b7d81a4b8837a"} Feb 23 07:05:20 crc kubenswrapper[5118]: I0223 07:05:20.925987 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" podStartSLOduration=2.92595633 podStartE2EDuration="2.92595633s" podCreationTimestamp="2026-02-23 07:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 07:05:20.915198528 +0000 UTC m=+1183.918983101" watchObservedRunningTime="2026-02-23 07:05:20.92595633 +0000 UTC m=+1183.929740903" Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.710601 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" path="/var/lib/kubelet/pods/97208c3d-4be8-473e-8949-b3c9a5f274f9/volumes" Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.711804 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" path="/var/lib/kubelet/pods/ccba9518-711f-4a31-aff7-1817619a7a30/volumes" Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.879739 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" event={"ID":"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10","Type":"ContainerStarted","Data":"e04bed901323060d8bdbf121a8c0a21a8592ac08dc80af47b6dc5111c17834a4"} Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.880506 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.882052 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerStarted","Data":"b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac"} Feb 23 07:05:21 crc kubenswrapper[5118]: I0223 07:05:21.901012 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" podStartSLOduration=3.900966431 podStartE2EDuration="3.900966431s" podCreationTimestamp="2026-02-23 07:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:21.898674177 +0000 UTC m=+1184.902458750" watchObservedRunningTime="2026-02-23 07:05:21.900966431 
+0000 UTC m=+1184.904751004" Feb 23 07:05:22 crc kubenswrapper[5118]: I0223 07:05:22.894299 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerStarted","Data":"25d2ae49d9a05edcb3c8c8fa2bea4dca27a3dbb07af2c6c8fc988dc353770b62"} Feb 23 07:05:22 crc kubenswrapper[5118]: I0223 07:05:22.895086 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 23 07:05:22 crc kubenswrapper[5118]: I0223 07:05:22.929205 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.672125518 podStartE2EDuration="3.929181216s" podCreationTimestamp="2026-02-23 07:05:19 +0000 UTC" firstStartedPulling="2026-02-23 07:05:20.248582755 +0000 UTC m=+1183.252367368" lastFinishedPulling="2026-02-23 07:05:21.505638483 +0000 UTC m=+1184.509423066" observedRunningTime="2026-02-23 07:05:22.924333111 +0000 UTC m=+1185.928117714" watchObservedRunningTime="2026-02-23 07:05:22.929181216 +0000 UTC m=+1185.932965809" Feb 23 07:05:24 crc kubenswrapper[5118]: I0223 07:05:24.338482 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 23 07:05:24 crc kubenswrapper[5118]: I0223 07:05:24.338978 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 23 07:05:24 crc kubenswrapper[5118]: I0223 07:05:24.452673 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 23 07:05:25 crc kubenswrapper[5118]: I0223 07:05:25.041487 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 23 07:05:25 crc kubenswrapper[5118]: I0223 07:05:25.418723 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:25 crc 
kubenswrapper[5118]: I0223 07:05:25.418791 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:25 crc kubenswrapper[5118]: I0223 07:05:25.546764 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.148877 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.809366 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1011-account-create-update-vx6cd"] Feb 23 07:05:26 crc kubenswrapper[5118]: E0223 07:05:26.809893 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="init" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.809911 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="init" Feb 23 07:05:26 crc kubenswrapper[5118]: E0223 07:05:26.809948 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="dnsmasq-dns" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.809958 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="dnsmasq-dns" Feb 23 07:05:26 crc kubenswrapper[5118]: E0223 07:05:26.809982 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" containerName="init" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.809992 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" containerName="init" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.810415 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="97208c3d-4be8-473e-8949-b3c9a5f274f9" 
containerName="init" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.810442 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccba9518-711f-4a31-aff7-1817619a7a30" containerName="dnsmasq-dns" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.811339 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.813782 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.820724 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1011-account-create-update-vx6cd"] Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.867271 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-brjbz"] Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.868585 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.887926 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-brjbz"] Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.940380 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfds7\" (UniqueName: \"kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.940482 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.996156 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2e07-account-create-update-sqj4n"] Feb 23 07:05:26 crc kubenswrapper[5118]: I0223 07:05:26.997826 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.000945 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.020589 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dcn6z"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.021837 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.027780 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2e07-account-create-update-sqj4n"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.041991 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfds7\" (UniqueName: \"kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.042074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.042135 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwlnx\" (UniqueName: \"kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.042181 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.043448 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.050426 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dcn6z"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.071280 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfds7\" (UniqueName: \"kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7\") pod \"keystone-1011-account-create-update-vx6cd\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") " pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.143676 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-1011-account-create-update-vx6cd" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144084 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144196 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts\") pod \"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144265 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144334 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgw8\" (UniqueName: \"kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144390 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwlnx\" (UniqueName: 
\"kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.144610 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg8xh\" (UniqueName: \"kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh\") pod \"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.145410 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.173152 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwlnx\" (UniqueName: \"kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx\") pod \"keystone-db-create-brjbz\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") " pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.184723 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-brjbz" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.247472 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts\") pod \"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.247928 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgw8\" (UniqueName: \"kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.247978 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg8xh\" (UniqueName: \"kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh\") pod \"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.248058 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.248373 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts\") pod 
\"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.248779 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.267087 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgw8\" (UniqueName: \"kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8\") pod \"placement-db-create-dcn6z\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") " pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.268362 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg8xh\" (UniqueName: \"kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh\") pod \"placement-2e07-account-create-update-sqj4n\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") " pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.317158 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-sqj4n" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.352086 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-dcn6z" Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.694102 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1011-account-create-update-vx6cd"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.803647 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-brjbz"] Feb 23 07:05:27 crc kubenswrapper[5118]: W0223 07:05:27.813147 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49b7ca43_be43_4d92_9d24_6dbf40d3132d.slice/crio-f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25 WatchSource:0}: Error finding container f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25: Status 404 returned error can't find the container with id f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25 Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.917444 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2e07-account-create-update-sqj4n"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.927195 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dcn6z"] Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.939954 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcn6z" event={"ID":"3940758f-3420-45cc-8824-06a4daf1b598","Type":"ContainerStarted","Data":"ce28f033d8314c886ed5971b5d0c4cfc7f5befb8649a3008e78c05bb8ffae03b"} Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.942373 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-sqj4n" event={"ID":"35127816-7431-4cc3-abde-24014813fbec","Type":"ContainerStarted","Data":"e5cef4db065e0c4a32430960e68a765611d6baba2947a25b7d32dae22e1fd69e"} Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.943309 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-brjbz" event={"ID":"49b7ca43-be43-4d92-9d24-6dbf40d3132d","Type":"ContainerStarted","Data":"f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25"} Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.947450 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1011-account-create-update-vx6cd" event={"ID":"0542fde3-000d-4897-9f0b-ec6a050ac5be","Type":"ContainerStarted","Data":"6a23fd3d2542fd72adf146968189c89588ba91f524f068f3518a0a21f82692ed"} Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.947484 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1011-account-create-update-vx6cd" event={"ID":"0542fde3-000d-4897-9f0b-ec6a050ac5be","Type":"ContainerStarted","Data":"6bb79e57ffa75777379a8efe1d798210e82a507f7f576760b1a0fb68128ada43"} Feb 23 07:05:27 crc kubenswrapper[5118]: I0223 07:05:27.981049 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1011-account-create-update-vx6cd" podStartSLOduration=1.9810223439999999 podStartE2EDuration="1.981022344s" podCreationTimestamp="2026-02-23 07:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:27.973013415 +0000 UTC m=+1190.976797988" watchObservedRunningTime="2026-02-23 07:05:27.981022344 +0000 UTC m=+1190.984806917" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.109389 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"] Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.110170 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="dnsmasq-dns" containerID="cri-o://7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c" 
gracePeriod=10 Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.111371 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.142074 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"] Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.143779 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.184339 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"] Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.277768 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.277953 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.278081 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.278228 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.278394 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx9bv\" (UniqueName: \"kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.381176 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.381266 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx9bv\" (UniqueName: \"kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.381309 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.381338 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.381375 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.382586 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.382937 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.384145 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.384778 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.424129 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx9bv\" (UniqueName: \"kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv\") pod \"dnsmasq-dns-689df5d84f-vhntj\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.508759 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Feb 23 07:05:28 crc kubenswrapper[5118]: I0223 07:05:28.639813 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.947039 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.966979 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcn6z" event={"ID":"3940758f-3420-45cc-8824-06a4daf1b598","Type":"ContainerStarted","Data":"62a6b39873821838f5e6ec500bb13fb1c5bafa215a40ab539f49df60bc1bfa58"} Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.969366 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.974303 5118 generic.go:334] "Generic (PLEG): container finished" podID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerID="7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c" exitCode=0 Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.974435 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" event={"ID":"b619f375-a02c-434f-a7fb-fc15a9c19bc3","Type":"ContainerDied","Data":"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c"} Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.974531 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw" event={"ID":"b619f375-a02c-434f-a7fb-fc15a9c19bc3","Type":"ContainerDied","Data":"0f6960f1e33f01366d307dc92be1049665f10c0cec1f90951e919c832e947ff3"} Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.974557 5118 scope.go:117] "RemoveContainer" containerID="7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.981905 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-sqj4n" event={"ID":"35127816-7431-4cc3-abde-24014813fbec","Type":"ContainerStarted","Data":"6017ddaf1fbb67ccc35bf16bd2c5d9c4390cd9ca9d79da4052a7cd0a0dfc5b1a"} Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.988508 5118 generic.go:334] "Generic (PLEG): container finished" podID="49b7ca43-be43-4d92-9d24-6dbf40d3132d" containerID="efeb2dee3d60940a41bd6f825379f0cd8db4a63a4c9712ef10172e0587fe62a9" exitCode=0 Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.990559 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-brjbz" 
event={"ID":"49b7ca43-be43-4d92-9d24-6dbf40d3132d","Type":"ContainerDied","Data":"efeb2dee3d60940a41bd6f825379f0cd8db4a63a4c9712ef10172e0587fe62a9"} Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:28.997057 5118 scope.go:117] "RemoveContainer" containerID="8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.007082 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dcn6z" podStartSLOduration=3.007052467 podStartE2EDuration="3.007052467s" podCreationTimestamp="2026-02-23 07:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:28.998419344 +0000 UTC m=+1192.002203927" watchObservedRunningTime="2026-02-23 07:05:29.007052467 +0000 UTC m=+1192.010837040" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.015125 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbs6f\" (UniqueName: \"kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f\") pod \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.015205 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc\") pod \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.015266 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb\") pod \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " Feb 23 07:05:29 crc kubenswrapper[5118]: 
I0223 07:05:29.015406 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config\") pod \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\" (UID: \"b619f375-a02c-434f-a7fb-fc15a9c19bc3\") " Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.032648 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f" (OuterVolumeSpecName: "kube-api-access-jbs6f") pod "b619f375-a02c-434f-a7fb-fc15a9c19bc3" (UID: "b619f375-a02c-434f-a7fb-fc15a9c19bc3"). InnerVolumeSpecName "kube-api-access-jbs6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.042074 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-2e07-account-create-update-sqj4n" podStartSLOduration=3.042046682 podStartE2EDuration="3.042046682s" podCreationTimestamp="2026-02-23 07:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:29.039389889 +0000 UTC m=+1192.043174482" watchObservedRunningTime="2026-02-23 07:05:29.042046682 +0000 UTC m=+1192.045831265" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.045004 5118 scope.go:117] "RemoveContainer" containerID="7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c" Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.045554 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c\": container with ID starting with 7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c not found: ID does not exist" containerID="7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c" Feb 23 
07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.045584 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c"} err="failed to get container status \"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c\": rpc error: code = NotFound desc = could not find container \"7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c\": container with ID starting with 7bd7e5849852787de9c4076f9459e6770b7897f6ae54ed31269e01734413d54c not found: ID does not exist" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.045603 5118 scope.go:117] "RemoveContainer" containerID="8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c" Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.045906 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c\": container with ID starting with 8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c not found: ID does not exist" containerID="8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.045925 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c"} err="failed to get container status \"8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c\": rpc error: code = NotFound desc = could not find container \"8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c\": container with ID starting with 8ade020bddf5f95acc5772467efd0dff81037ec016972db37b82936e0b1f430c not found: ID does not exist" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.060746 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b619f375-a02c-434f-a7fb-fc15a9c19bc3" (UID: "b619f375-a02c-434f-a7fb-fc15a9c19bc3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.095817 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config" (OuterVolumeSpecName: "config") pod "b619f375-a02c-434f-a7fb-fc15a9c19bc3" (UID: "b619f375-a02c-434f-a7fb-fc15a9c19bc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.118349 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbs6f\" (UniqueName: \"kubernetes.io/projected/b619f375-a02c-434f-a7fb-fc15a9c19bc3-kube-api-access-jbs6f\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.118386 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.118399 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.119391 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b619f375-a02c-434f-a7fb-fc15a9c19bc3" (UID: "b619f375-a02c-434f-a7fb-fc15a9c19bc3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.219958 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b619f375-a02c-434f-a7fb-fc15a9c19bc3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.264275 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.264729 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="dnsmasq-dns" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.264745 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="dnsmasq-dns" Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.264784 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="init" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.264792 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="init" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.265013 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" containerName="dnsmasq-dns" Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.272031 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.274072 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.274436 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.274478 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9d75l"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.275552 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.296115 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.322715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.322865 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4ms\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.322912 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.322936 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.323344 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.323647 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425481 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425541 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425578 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4ms\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425605 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425621 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.425669 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.425827 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.425847 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.425920 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:05:29.925896399 +0000 UTC m=+1192.929680972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.426318 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.426681 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.427053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.434065 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.451308 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4ms\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.466894 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: I0223 07:05:29.935513 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.936017 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.936033 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:29 crc kubenswrapper[5118]: E0223 07:05:29.936085 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:05:30.936067183 +0000 UTC m=+1193.939851756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.001076 5118 generic.go:334] "Generic (PLEG): container finished" podID="0542fde3-000d-4897-9f0b-ec6a050ac5be" containerID="6a23fd3d2542fd72adf146968189c89588ba91f524f068f3518a0a21f82692ed" exitCode=0
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.001159 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1011-account-create-update-vx6cd" event={"ID":"0542fde3-000d-4897-9f0b-ec6a050ac5be","Type":"ContainerDied","Data":"6a23fd3d2542fd72adf146968189c89588ba91f524f068f3518a0a21f82692ed"}
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.002768 5118 generic.go:334] "Generic (PLEG): container finished" podID="3940758f-3420-45cc-8824-06a4daf1b598" containerID="62a6b39873821838f5e6ec500bb13fb1c5bafa215a40ab539f49df60bc1bfa58" exitCode=0
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.002835 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcn6z" event={"ID":"3940758f-3420-45cc-8824-06a4daf1b598","Type":"ContainerDied","Data":"62a6b39873821838f5e6ec500bb13fb1c5bafa215a40ab539f49df60bc1bfa58"}
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.004306 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-vs8bw"
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.005855 5118 generic.go:334] "Generic (PLEG): container finished" podID="35127816-7431-4cc3-abde-24014813fbec" containerID="6017ddaf1fbb67ccc35bf16bd2c5d9c4390cd9ca9d79da4052a7cd0a0dfc5b1a" exitCode=0
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.006003 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-sqj4n" event={"ID":"35127816-7431-4cc3-abde-24014813fbec","Type":"ContainerDied","Data":"6017ddaf1fbb67ccc35bf16bd2c5d9c4390cd9ca9d79da4052a7cd0a0dfc5b1a"}
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.057131 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"]
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.081939 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"]
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.087786 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-vs8bw"]
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.439033 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-brjbz"
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.451596 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts\") pod \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") "
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.451787 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwlnx\" (UniqueName: \"kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx\") pod \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\" (UID: \"49b7ca43-be43-4d92-9d24-6dbf40d3132d\") "
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.455537 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49b7ca43-be43-4d92-9d24-6dbf40d3132d" (UID: "49b7ca43-be43-4d92-9d24-6dbf40d3132d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.461248 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx" (OuterVolumeSpecName: "kube-api-access-gwlnx") pod "49b7ca43-be43-4d92-9d24-6dbf40d3132d" (UID: "49b7ca43-be43-4d92-9d24-6dbf40d3132d"). InnerVolumeSpecName "kube-api-access-gwlnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.554608 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49b7ca43-be43-4d92-9d24-6dbf40d3132d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.554641 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwlnx\" (UniqueName: \"kubernetes.io/projected/49b7ca43-be43-4d92-9d24-6dbf40d3132d-kube-api-access-gwlnx\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:30 crc kubenswrapper[5118]: I0223 07:05:30.961665 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:30 crc kubenswrapper[5118]: E0223 07:05:30.961906 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:30 crc kubenswrapper[5118]: E0223 07:05:30.961937 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:30 crc kubenswrapper[5118]: E0223 07:05:30.962015 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:05:32.961985734 +0000 UTC m=+1195.965770307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.005165 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vlb9z"]
Feb 23 07:05:31 crc kubenswrapper[5118]: E0223 07:05:31.005601 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b7ca43-be43-4d92-9d24-6dbf40d3132d" containerName="mariadb-database-create"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.005619 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b7ca43-be43-4d92-9d24-6dbf40d3132d" containerName="mariadb-database-create"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.005790 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b7ca43-be43-4d92-9d24-6dbf40d3132d" containerName="mariadb-database-create"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.006440 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.020482 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vlb9z"]
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.021442 5118 generic.go:334] "Generic (PLEG): container finished" podID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerID="aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b" exitCode=0
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.022436 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" event={"ID":"d4b341f1-f2c2-4114-b200-1d1c351e84ac","Type":"ContainerDied","Data":"aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b"}
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.022548 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" event={"ID":"d4b341f1-f2c2-4114-b200-1d1c351e84ac","Type":"ContainerStarted","Data":"b5424f608bab6d406d9e103c42ccd47452c79d433df11ccf0e604ab2feaa728e"}
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.027608 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-brjbz"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.032990 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-brjbz" event={"ID":"49b7ca43-be43-4d92-9d24-6dbf40d3132d","Type":"ContainerDied","Data":"f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25"}
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.033124 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b9425c83bd27361bf9c9c4dd9d4bbc9342d32409134dbb8018eb2885a7ca25"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.120516 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-284f-account-create-update-w9fdc"]
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.123940 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.133276 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-284f-account-create-update-w9fdc"]
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.133462 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.165955 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.166077 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctphp\" (UniqueName: \"kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.267722 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctphp\" (UniqueName: \"kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.267839 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.267884 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.267912 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzlz6\" (UniqueName: \"kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.268878 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.293396 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctphp\" (UniqueName: \"kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp\") pod \"glance-db-create-vlb9z\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.323757 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vlb9z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.398336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzlz6\" (UniqueName: \"kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.398870 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.399676 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.459444 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzlz6\" (UniqueName: \"kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6\") pod \"glance-284f-account-create-update-w9fdc\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.486587 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-w9fdc"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.582140 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-sqj4n"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.603167 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts\") pod \"35127816-7431-4cc3-abde-24014813fbec\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.603319 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg8xh\" (UniqueName: \"kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh\") pod \"35127816-7431-4cc3-abde-24014813fbec\" (UID: \"35127816-7431-4cc3-abde-24014813fbec\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.607036 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35127816-7431-4cc3-abde-24014813fbec" (UID: "35127816-7431-4cc3-abde-24014813fbec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.608488 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh" (OuterVolumeSpecName: "kube-api-access-sg8xh") pod "35127816-7431-4cc3-abde-24014813fbec" (UID: "35127816-7431-4cc3-abde-24014813fbec"). InnerVolumeSpecName "kube-api-access-sg8xh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.704945 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg8xh\" (UniqueName: \"kubernetes.io/projected/35127816-7431-4cc3-abde-24014813fbec-kube-api-access-sg8xh\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.704996 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35127816-7431-4cc3-abde-24014813fbec-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.727846 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b619f375-a02c-434f-a7fb-fc15a9c19bc3" path="/var/lib/kubelet/pods/b619f375-a02c-434f-a7fb-fc15a9c19bc3/volumes"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.765043 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dcn6z"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.785644 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-vx6cd"
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.806825 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts\") pod \"0542fde3-000d-4897-9f0b-ec6a050ac5be\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.809275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0542fde3-000d-4897-9f0b-ec6a050ac5be" (UID: "0542fde3-000d-4897-9f0b-ec6a050ac5be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.907906 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgw8\" (UniqueName: \"kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8\") pod \"3940758f-3420-45cc-8824-06a4daf1b598\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.908167 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts\") pod \"3940758f-3420-45cc-8824-06a4daf1b598\" (UID: \"3940758f-3420-45cc-8824-06a4daf1b598\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.908318 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfds7\" (UniqueName: \"kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7\") pod \"0542fde3-000d-4897-9f0b-ec6a050ac5be\" (UID: \"0542fde3-000d-4897-9f0b-ec6a050ac5be\") "
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.908699 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0542fde3-000d-4897-9f0b-ec6a050ac5be-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.913001 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3940758f-3420-45cc-8824-06a4daf1b598" (UID: "3940758f-3420-45cc-8824-06a4daf1b598"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.915468 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7" (OuterVolumeSpecName: "kube-api-access-pfds7") pod "0542fde3-000d-4897-9f0b-ec6a050ac5be" (UID: "0542fde3-000d-4897-9f0b-ec6a050ac5be"). InnerVolumeSpecName "kube-api-access-pfds7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:31 crc kubenswrapper[5118]: I0223 07:05:31.916715 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8" (OuterVolumeSpecName: "kube-api-access-rkgw8") pod "3940758f-3420-45cc-8824-06a4daf1b598" (UID: "3940758f-3420-45cc-8824-06a4daf1b598"). InnerVolumeSpecName "kube-api-access-rkgw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.010321 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfds7\" (UniqueName: \"kubernetes.io/projected/0542fde3-000d-4897-9f0b-ec6a050ac5be-kube-api-access-pfds7\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.011613 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgw8\" (UniqueName: \"kubernetes.io/projected/3940758f-3420-45cc-8824-06a4daf1b598-kube-api-access-rkgw8\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.011637 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3940758f-3420-45cc-8824-06a4daf1b598-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.038312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" event={"ID":"d4b341f1-f2c2-4114-b200-1d1c351e84ac","Type":"ContainerStarted","Data":"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1"}
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.038421 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-vhntj"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.042438 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-sqj4n" event={"ID":"35127816-7431-4cc3-abde-24014813fbec","Type":"ContainerDied","Data":"e5cef4db065e0c4a32430960e68a765611d6baba2947a25b7d32dae22e1fd69e"}
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.042475 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5cef4db065e0c4a32430960e68a765611d6baba2947a25b7d32dae22e1fd69e"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.042551 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-sqj4n"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.082916 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1011-account-create-update-vx6cd" event={"ID":"0542fde3-000d-4897-9f0b-ec6a050ac5be","Type":"ContainerDied","Data":"6bb79e57ffa75777379a8efe1d798210e82a507f7f576760b1a0fb68128ada43"}
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.082977 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bb79e57ffa75777379a8efe1d798210e82a507f7f576760b1a0fb68128ada43"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.083076 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-vx6cd"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.090667 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dcn6z" event={"ID":"3940758f-3420-45cc-8824-06a4daf1b598","Type":"ContainerDied","Data":"ce28f033d8314c886ed5971b5d0c4cfc7f5befb8649a3008e78c05bb8ffae03b"}
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.090724 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce28f033d8314c886ed5971b5d0c4cfc7f5befb8649a3008e78c05bb8ffae03b"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.090806 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dcn6z"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.097425 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" podStartSLOduration=4.097399545 podStartE2EDuration="4.097399545s" podCreationTimestamp="2026-02-23 07:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:32.081161562 +0000 UTC m=+1195.084946155" watchObservedRunningTime="2026-02-23 07:05:32.097399545 +0000 UTC m=+1195.101184118"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.119778 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-284f-account-create-update-w9fdc"]
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.172163 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vlb9z"]
Feb 23 07:05:32 crc kubenswrapper[5118]: W0223 07:05:32.175700 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod892a426d_2034_416c_b5ce_9dbe665ff99e.slice/crio-3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed WatchSource:0}: Error finding container 3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed: Status 404 returned error can't find the container with id 3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.802639 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x2lv5"]
Feb 23 07:05:32 crc kubenswrapper[5118]: E0223 07:05:32.802985 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0542fde3-000d-4897-9f0b-ec6a050ac5be" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803003 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0542fde3-000d-4897-9f0b-ec6a050ac5be" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: E0223 07:05:32.803025 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3940758f-3420-45cc-8824-06a4daf1b598" containerName="mariadb-database-create"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803034 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3940758f-3420-45cc-8824-06a4daf1b598" containerName="mariadb-database-create"
Feb 23 07:05:32 crc kubenswrapper[5118]: E0223 07:05:32.803048 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35127816-7431-4cc3-abde-24014813fbec" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803055 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="35127816-7431-4cc3-abde-24014813fbec" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803231 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0542fde3-000d-4897-9f0b-ec6a050ac5be" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803245 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3940758f-3420-45cc-8824-06a4daf1b598" containerName="mariadb-database-create"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803260 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="35127816-7431-4cc3-abde-24014813fbec" containerName="mariadb-account-create-update"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.803793 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.807015 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.826001 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x2lv5"]
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.932377 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwh6z\" (UniqueName: \"kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z\") pod \"root-account-create-update-x2lv5\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:32 crc kubenswrapper[5118]: I0223 07:05:32.932461 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts\") pod \"root-account-create-update-x2lv5\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.034188 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwh6z\" (UniqueName: \"kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z\") pod \"root-account-create-update-x2lv5\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.034280 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts\") pod \"root-account-create-update-x2lv5\" (UID:
\"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.034395 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0" Feb 23 07:05:33 crc kubenswrapper[5118]: E0223 07:05:33.034709 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 07:05:33 crc kubenswrapper[5118]: E0223 07:05:33.034743 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 07:05:33 crc kubenswrapper[5118]: E0223 07:05:33.034860 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:05:37.034809459 +0000 UTC m=+1200.038594072 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.035864 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts\") pod \"root-account-create-update-x2lv5\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.060703 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwh6z\" (UniqueName: \"kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z\") pod \"root-account-create-update-x2lv5\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") " pod="openstack/root-account-create-update-x2lv5" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.103226 5118 generic.go:334] "Generic (PLEG): container finished" podID="892a426d-2034-416c-b5ce-9dbe665ff99e" containerID="d3a70082746e2d19d58943e96283801f0727e8f267055d1291d429d8d0285236" exitCode=0 Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.103317 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vlb9z" event={"ID":"892a426d-2034-416c-b5ce-9dbe665ff99e","Type":"ContainerDied","Data":"d3a70082746e2d19d58943e96283801f0727e8f267055d1291d429d8d0285236"} Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.103378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vlb9z" event={"ID":"892a426d-2034-416c-b5ce-9dbe665ff99e","Type":"ContainerStarted","Data":"3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed"} Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.105601 5118 
generic.go:334] "Generic (PLEG): container finished" podID="235a2c28-6291-44bc-8b66-10fe612e0c9e" containerID="0e7c3b3803f8e6c09d371de5e9ca37ddd7fa5760543321c91850ad4c7a5f01e4" exitCode=0 Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.105686 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-284f-account-create-update-w9fdc" event={"ID":"235a2c28-6291-44bc-8b66-10fe612e0c9e","Type":"ContainerDied","Data":"0e7c3b3803f8e6c09d371de5e9ca37ddd7fa5760543321c91850ad4c7a5f01e4"} Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.105728 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-284f-account-create-update-w9fdc" event={"ID":"235a2c28-6291-44bc-8b66-10fe612e0c9e","Type":"ContainerStarted","Data":"51028f7ca4bf60925e8e1e8c65e641e6d5ca3f6bbc1ea9713479b04acfea47fa"} Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.122751 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2lv5" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.185409 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7lndk"] Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.186643 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.191386 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.191580 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.191758 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.209615 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7lndk"] Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349053 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349633 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349678 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7t5h\" (UniqueName: \"kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349717 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349744 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.349763 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452249 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 
07:05:33.452396 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452478 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7t5h\" (UniqueName: \"kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452515 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452546 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.452577 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.453236 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.453596 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.453607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.457637 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.458652 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle\") pod 
\"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.467610 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.481781 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7t5h\" (UniqueName: \"kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h\") pod \"swift-ring-rebalance-7lndk\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.544175 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:33 crc kubenswrapper[5118]: I0223 07:05:33.661870 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x2lv5"] Feb 23 07:05:33 crc kubenswrapper[5118]: W0223 07:05:33.669814 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66106e37_3d73_40dd_b86d_b73ed5e8ae74.slice/crio-0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3 WatchSource:0}: Error finding container 0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3: Status 404 returned error can't find the container with id 0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3 Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.021118 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7lndk"] Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.117386 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7lndk" event={"ID":"81af6409-f8ce-485c-a2a1-1b1cce7c5433","Type":"ContainerStarted","Data":"8ae886d3dad73cfa747014d2930aeb3c5a885320bbc58f8eb5a8d5c85d9d14a3"} Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.119591 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2lv5" event={"ID":"66106e37-3d73-40dd-b86d-b73ed5e8ae74","Type":"ContainerStarted","Data":"a2e3b6a370edeb80f110bba958bb271f7682fd85974733af6d118cfdaf41a31c"} Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.119648 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2lv5" event={"ID":"66106e37-3d73-40dd-b86d-b73ed5e8ae74","Type":"ContainerStarted","Data":"0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3"} Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.143436 5118 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/root-account-create-update-x2lv5" podStartSLOduration=2.143408007 podStartE2EDuration="2.143408007s" podCreationTimestamp="2026-02-23 07:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:34.137839896 +0000 UTC m=+1197.141624499" watchObservedRunningTime="2026-02-23 07:05:34.143408007 +0000 UTC m=+1197.147192600" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.650584 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vlb9z" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.660826 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-w9fdc" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.788773 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzlz6\" (UniqueName: \"kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6\") pod \"235a2c28-6291-44bc-8b66-10fe612e0c9e\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.788973 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts\") pod \"892a426d-2034-416c-b5ce-9dbe665ff99e\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.789053 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts\") pod \"235a2c28-6291-44bc-8b66-10fe612e0c9e\" (UID: \"235a2c28-6291-44bc-8b66-10fe612e0c9e\") " Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 
07:05:34.789184 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctphp\" (UniqueName: \"kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp\") pod \"892a426d-2034-416c-b5ce-9dbe665ff99e\" (UID: \"892a426d-2034-416c-b5ce-9dbe665ff99e\") " Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.790129 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "892a426d-2034-416c-b5ce-9dbe665ff99e" (UID: "892a426d-2034-416c-b5ce-9dbe665ff99e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.790767 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "235a2c28-6291-44bc-8b66-10fe612e0c9e" (UID: "235a2c28-6291-44bc-8b66-10fe612e0c9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.796768 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6" (OuterVolumeSpecName: "kube-api-access-kzlz6") pod "235a2c28-6291-44bc-8b66-10fe612e0c9e" (UID: "235a2c28-6291-44bc-8b66-10fe612e0c9e"). InnerVolumeSpecName "kube-api-access-kzlz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.801569 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp" (OuterVolumeSpecName: "kube-api-access-ctphp") pod "892a426d-2034-416c-b5ce-9dbe665ff99e" (UID: "892a426d-2034-416c-b5ce-9dbe665ff99e"). InnerVolumeSpecName "kube-api-access-ctphp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.892139 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/235a2c28-6291-44bc-8b66-10fe612e0c9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.892179 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctphp\" (UniqueName: \"kubernetes.io/projected/892a426d-2034-416c-b5ce-9dbe665ff99e-kube-api-access-ctphp\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.892194 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzlz6\" (UniqueName: \"kubernetes.io/projected/235a2c28-6291-44bc-8b66-10fe612e0c9e-kube-api-access-kzlz6\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:34 crc kubenswrapper[5118]: I0223 07:05:34.892206 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/892a426d-2034-416c-b5ce-9dbe665ff99e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.130157 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vlb9z" Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.130181 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vlb9z" event={"ID":"892a426d-2034-416c-b5ce-9dbe665ff99e","Type":"ContainerDied","Data":"3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed"} Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.130263 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3901d55882a98b44d18a0351c88165653f486bada663ac8528593b498bc7ebed" Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.132279 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-284f-account-create-update-w9fdc" event={"ID":"235a2c28-6291-44bc-8b66-10fe612e0c9e","Type":"ContainerDied","Data":"51028f7ca4bf60925e8e1e8c65e641e6d5ca3f6bbc1ea9713479b04acfea47fa"} Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.132321 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51028f7ca4bf60925e8e1e8c65e641e6d5ca3f6bbc1ea9713479b04acfea47fa" Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.132332 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-284f-account-create-update-w9fdc" Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.134455 5118 generic.go:334] "Generic (PLEG): container finished" podID="66106e37-3d73-40dd-b86d-b73ed5e8ae74" containerID="a2e3b6a370edeb80f110bba958bb271f7682fd85974733af6d118cfdaf41a31c" exitCode=0 Feb 23 07:05:35 crc kubenswrapper[5118]: I0223 07:05:35.134497 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2lv5" event={"ID":"66106e37-3d73-40dd-b86d-b73ed5e8ae74","Type":"ContainerDied","Data":"a2e3b6a370edeb80f110bba958bb271f7682fd85974733af6d118cfdaf41a31c"} Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.371699 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ms95n"] Feb 23 07:05:36 crc kubenswrapper[5118]: E0223 07:05:36.372554 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235a2c28-6291-44bc-8b66-10fe612e0c9e" containerName="mariadb-account-create-update" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.372571 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="235a2c28-6291-44bc-8b66-10fe612e0c9e" containerName="mariadb-account-create-update" Feb 23 07:05:36 crc kubenswrapper[5118]: E0223 07:05:36.372612 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892a426d-2034-416c-b5ce-9dbe665ff99e" containerName="mariadb-database-create" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.372619 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a426d-2034-416c-b5ce-9dbe665ff99e" containerName="mariadb-database-create" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.372829 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="892a426d-2034-416c-b5ce-9dbe665ff99e" containerName="mariadb-database-create" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.372850 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="235a2c28-6291-44bc-8b66-10fe612e0c9e" containerName="mariadb-account-create-update" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.373497 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ms95n"] Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.373591 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ms95n" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.392601 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zf7tv" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.392870 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.526028 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.526305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.526370 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzps2\" (UniqueName: \"kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n" Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 
07:05:36.526419 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.628022 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.628159 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.628233 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzps2\" (UniqueName: \"kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.628285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.636320 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.636337 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.638131 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.665705 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzps2\" (UniqueName: \"kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2\") pod \"glance-db-sync-ms95n\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") " pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:36 crc kubenswrapper[5118]: I0223 07:05:36.727165 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ms95n"
Feb 23 07:05:37 crc kubenswrapper[5118]: I0223 07:05:37.044067 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:37 crc kubenswrapper[5118]: E0223 07:05:37.044350 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:37 crc kubenswrapper[5118]: E0223 07:05:37.044394 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:37 crc kubenswrapper[5118]: E0223 07:05:37.044526 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:05:45.044472923 +0000 UTC m=+1208.048257506 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.175268 5118 generic.go:334] "Generic (PLEG): container finished" podID="5721793b-d753-4519-b484-fa9cb958def9" containerID="765ded8ac4c466b73ff678e59e3c1bd2c7e26f27d5b181a36f4fa8da845c96bd" exitCode=0
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.175817 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerDied","Data":"765ded8ac4c466b73ff678e59e3c1bd2c7e26f27d5b181a36f4fa8da845c96bd"}
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.180328 5118 generic.go:334] "Generic (PLEG): container finished" podID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerID="cdb183b32df15cc29470c163dcec76e49a1bcccca2adcf52c5951d4b1ff228f2" exitCode=0
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.180378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerDied","Data":"cdb183b32df15cc29470c163dcec76e49a1bcccca2adcf52c5951d4b1ff228f2"}
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.193285 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2lv5" event={"ID":"66106e37-3d73-40dd-b86d-b73ed5e8ae74","Type":"ContainerDied","Data":"0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3"}
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.193344 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bc3306149e572e71317d75b5cee4ba5f49cb72aadb3a67c927488255996a0b3"
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.396536 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.544290 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts\") pod \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") "
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.544609 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwh6z\" (UniqueName: \"kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z\") pod \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\" (UID: \"66106e37-3d73-40dd-b86d-b73ed5e8ae74\") "
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.546895 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66106e37-3d73-40dd-b86d-b73ed5e8ae74" (UID: "66106e37-3d73-40dd-b86d-b73ed5e8ae74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.552072 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z" (OuterVolumeSpecName: "kube-api-access-mwh6z") pod "66106e37-3d73-40dd-b86d-b73ed5e8ae74" (UID: "66106e37-3d73-40dd-b86d-b73ed5e8ae74"). InnerVolumeSpecName "kube-api-access-mwh6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.641347 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-vhntj"
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.646777 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwh6z\" (UniqueName: \"kubernetes.io/projected/66106e37-3d73-40dd-b86d-b73ed5e8ae74-kube-api-access-mwh6z\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.646833 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66106e37-3d73-40dd-b86d-b73ed5e8ae74-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.701521 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"]
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.701796 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="dnsmasq-dns" containerID="cri-o://e04bed901323060d8bdbf121a8c0a21a8592ac08dc80af47b6dc5111c17834a4" gracePeriod=10
Feb 23 07:05:38 crc kubenswrapper[5118]: I0223 07:05:38.927545 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ms95n"]
Feb 23 07:05:38 crc kubenswrapper[5118]: W0223 07:05:38.929670 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod222c1eb0_e9da_4365_ad64_850496d1ceb7.slice/crio-447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a WatchSource:0}: Error finding container 447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a: Status 404 returned error can't find the container with id 447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.203468 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ms95n" event={"ID":"222c1eb0-e9da-4365-ad64-850496d1ceb7","Type":"ContainerStarted","Data":"447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.205936 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerStarted","Data":"23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.206291 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.207519 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7lndk" event={"ID":"81af6409-f8ce-485c-a2a1-1b1cce7c5433","Type":"ContainerStarted","Data":"140e9d9472eed0af6d34cad05e08bfca2a8201add4f370c162aeadd6dde25013"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.210462 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerStarted","Data":"0a5be7ec0d548d228afd1de2a7afc107749b13909039d656bc1026e5bcf306a3"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.210892 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.212564 5118 generic.go:334] "Generic (PLEG): container finished" podID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerID="e04bed901323060d8bdbf121a8c0a21a8592ac08dc80af47b6dc5111c17834a4" exitCode=0
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.212621 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2lv5"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.213535 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" event={"ID":"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10","Type":"ContainerDied","Data":"e04bed901323060d8bdbf121a8c0a21a8592ac08dc80af47b6dc5111c17834a4"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.213586 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" event={"ID":"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10","Type":"ContainerDied","Data":"1a015464589a0b86a2ef4880bf31ab7687571218eae14289cdd72e8cc56afcbe"}
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.213605 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a015464589a0b86a2ef4880bf31ab7687571218eae14289cdd72e8cc56afcbe"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.248821 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.595664764 podStartE2EDuration="58.248799098s" podCreationTimestamp="2026-02-23 07:04:41 +0000 UTC" firstStartedPulling="2026-02-23 07:04:48.817489315 +0000 UTC m=+1151.821273888" lastFinishedPulling="2026-02-23 07:05:03.470623649 +0000 UTC m=+1166.474408222" observedRunningTime="2026-02-23 07:05:39.239709954 +0000 UTC m=+1202.243494527" watchObservedRunningTime="2026-02-23 07:05:39.248799098 +0000 UTC m=+1202.252583671"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.259138 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.300629 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.383508408 podStartE2EDuration="59.30061179s" podCreationTimestamp="2026-02-23 07:04:40 +0000 UTC" firstStartedPulling="2026-02-23 07:04:42.428763116 +0000 UTC m=+1145.432547689" lastFinishedPulling="2026-02-23 07:05:03.345866498 +0000 UTC m=+1166.349651071" observedRunningTime="2026-02-23 07:05:39.294085005 +0000 UTC m=+1202.297869578" watchObservedRunningTime="2026-02-23 07:05:39.30061179 +0000 UTC m=+1202.304396363"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.303192 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7lndk" podStartSLOduration=2.117548367 podStartE2EDuration="6.303181479s" podCreationTimestamp="2026-02-23 07:05:33 +0000 UTC" firstStartedPulling="2026-02-23 07:05:34.033652471 +0000 UTC m=+1197.037437074" lastFinishedPulling="2026-02-23 07:05:38.219285603 +0000 UTC m=+1201.223070186" observedRunningTime="2026-02-23 07:05:39.268725678 +0000 UTC m=+1202.272510251" watchObservedRunningTime="2026-02-23 07:05:39.303181479 +0000 UTC m=+1202.306966052"
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.362230 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkz8j\" (UniqueName: \"kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j\") pod \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") "
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.362484 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb\") pod \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") "
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.362522 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config\") pod \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") "
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.362582 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb\") pod \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") "
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.362617 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc\") pod \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\" (UID: \"b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10\") "
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.383193 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j" (OuterVolumeSpecName: "kube-api-access-tkz8j") pod "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" (UID: "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10"). InnerVolumeSpecName "kube-api-access-tkz8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.416199 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config" (OuterVolumeSpecName: "config") pod "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" (UID: "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.447539 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" (UID: "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.448620 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" (UID: "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.459606 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" (UID: "b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.465477 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkz8j\" (UniqueName: \"kubernetes.io/projected/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-kube-api-access-tkz8j\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.465517 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.465528 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.465538 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.465546 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:05:39 crc kubenswrapper[5118]: I0223 07:05:39.800337 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 23 07:05:40 crc kubenswrapper[5118]: I0223 07:05:40.221303 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59"
Feb 23 07:05:40 crc kubenswrapper[5118]: I0223 07:05:40.259270 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"]
Feb 23 07:05:40 crc kubenswrapper[5118]: I0223 07:05:40.267852 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-jnx59"]
Feb 23 07:05:41 crc kubenswrapper[5118]: I0223 07:05:41.711397 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" path="/var/lib/kubelet/pods/b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10/volumes"
Feb 23 07:05:43 crc kubenswrapper[5118]: I0223 07:05:43.945274 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75b7bcc64f-jnx59" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: i/o timeout"
Feb 23 07:05:44 crc kubenswrapper[5118]: I0223 07:05:44.079012 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x2lv5"]
Feb 23 07:05:44 crc kubenswrapper[5118]: I0223 07:05:44.087264 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x2lv5"]
Feb 23 07:05:45 crc kubenswrapper[5118]: I0223 07:05:45.082395 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0"
Feb 23 07:05:45 crc kubenswrapper[5118]: E0223 07:05:45.082657 5118 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 07:05:45 crc kubenswrapper[5118]: E0223 07:05:45.082676 5118 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 07:05:45 crc kubenswrapper[5118]: E0223 07:05:45.082729 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift podName:b7e8b9fa-2985-45f6-97e1-77a56b8ba9da nodeName:}" failed. No retries permitted until 2026-02-23 07:06:01.08271359 +0000 UTC m=+1224.086498163 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift") pod "swift-storage-0" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da") : configmap "swift-ring-files" not found
Feb 23 07:05:45 crc kubenswrapper[5118]: I0223 07:05:45.724711 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66106e37-3d73-40dd-b86d-b73ed5e8ae74" path="/var/lib/kubelet/pods/66106e37-3d73-40dd-b86d-b73ed5e8ae74/volumes"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.276365 5118 generic.go:334] "Generic (PLEG): container finished" podID="81af6409-f8ce-485c-a2a1-1b1cce7c5433" containerID="140e9d9472eed0af6d34cad05e08bfca2a8201add4f370c162aeadd6dde25013" exitCode=0
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.276522 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7lndk" event={"ID":"81af6409-f8ce-485c-a2a1-1b1cce7c5433","Type":"ContainerDied","Data":"140e9d9472eed0af6d34cad05e08bfca2a8201add4f370c162aeadd6dde25013"}
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.688506 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" probeResult="failure" output=<
Feb 23 07:05:46 crc kubenswrapper[5118]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 23 07:05:46 crc kubenswrapper[5118]: >
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.707439 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dvc2v"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.729249 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dvc2v"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.962388 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f8rdg-config-gdzbt"]
Feb 23 07:05:46 crc kubenswrapper[5118]: E0223 07:05:46.962927 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="init"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.962950 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="init"
Feb 23 07:05:46 crc kubenswrapper[5118]: E0223 07:05:46.962991 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="dnsmasq-dns"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.963000 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="dnsmasq-dns"
Feb 23 07:05:46 crc kubenswrapper[5118]: E0223 07:05:46.963016 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66106e37-3d73-40dd-b86d-b73ed5e8ae74" containerName="mariadb-account-create-update"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.963024 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="66106e37-3d73-40dd-b86d-b73ed5e8ae74" containerName="mariadb-account-create-update"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.964135 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bdbfaa-d22f-4983-ab46-f7d2ae6a4e10" containerName="dnsmasq-dns"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.964163 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="66106e37-3d73-40dd-b86d-b73ed5e8ae74" containerName="mariadb-account-create-update"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.965042 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.968462 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 23 07:05:46 crc kubenswrapper[5118]: I0223 07:05:46.982561 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8rdg-config-gdzbt"]
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.129845 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4s6m\" (UniqueName: \"kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.129902 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.129923 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.129997 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.130027 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.130085 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232464 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4s6m\" (UniqueName: \"kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232531 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232586 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232610 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232660 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.232896 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.233519 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.233527 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.233545 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.236081 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.253485 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4s6m\" (UniqueName: \"kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m\") pod \"ovn-controller-f8rdg-config-gdzbt\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:47 crc kubenswrapper[5118]: I0223 07:05:47.288570 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.102949 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fst8q"]
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.105322 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.107957 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.110323 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fst8q"]
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.184192 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkght\" (UniqueName: \"kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.184411 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.288947 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.289049 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkght\" (UniqueName: \"kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.290197 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.317130 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkght\" (UniqueName: \"kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght\") pod \"root-account-create-update-fst8q\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:49 crc kubenswrapper[5118]: I0223 07:05:49.430248 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fst8q"
Feb 23 07:05:51 crc kubenswrapper[5118]: I0223 07:05:51.690958 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" probeResult="failure" output=<
Feb 23 07:05:51 crc kubenswrapper[5118]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 23 07:05:51 crc kubenswrapper[5118]: >
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.069314 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.374231 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wgxlm"]
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.375523 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wgxlm"
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.395488 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wgxlm"]
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.495020 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6db8-account-create-update-c2rqc"]
Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.496438 5118 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.505848 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.506599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts\") pod \"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.506782 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.506959 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4mt\" (UniqueName: \"kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt\") pod \"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.507164 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvxm\" (UniqueName: \"kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.538993 5118 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6db8-account-create-update-c2rqc"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.608947 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.609306 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4mt\" (UniqueName: \"kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt\") pod \"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.609491 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvxm\" (UniqueName: \"kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.609762 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts\") pod \"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.610331 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts\") pod 
\"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.610886 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.652328 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.654190 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4mt\" (UniqueName: \"kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt\") pod \"cinder-db-create-wgxlm\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.671161 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvxm\" (UniqueName: \"kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm\") pod \"cinder-6db8-account-create-update-c2rqc\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.749451 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wgxlm" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.770151 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wvskp"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.773878 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.783216 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wvskp"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.793649 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a094-account-create-update-htzlh"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.795203 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.797590 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.798396 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.832931 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.860526 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a094-account-create-update-htzlh"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.927995 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2fp2g"] Feb 23 07:05:52 crc kubenswrapper[5118]: E0223 07:05:52.928596 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81af6409-f8ce-485c-a2a1-1b1cce7c5433" containerName="swift-ring-rebalance" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.928618 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="81af6409-f8ce-485c-a2a1-1b1cce7c5433" containerName="swift-ring-rebalance" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.928815 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="81af6409-f8ce-485c-a2a1-1b1cce7c5433" containerName="swift-ring-rebalance" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.929644 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.931551 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.931649 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.931804 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.931927 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.931963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.932035 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.932066 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7t5h\" (UniqueName: \"kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h\") pod \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\" (UID: \"81af6409-f8ce-485c-a2a1-1b1cce7c5433\") " Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.932434 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.936987 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.937764 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdc4j\" (UniqueName: \"kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.937801 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t6zq\" (UniqueName: \"kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.937992 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.938120 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.938190 5118 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/81af6409-f8ce-485c-a2a1-1b1cce7c5433-etc-swift\") on node 
\"crc\" DevicePath \"\"" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.938206 5118 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.943416 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h" (OuterVolumeSpecName: "kube-api-access-s7t5h") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "kube-api-access-s7t5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.961015 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.984637 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2fp2g"] Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.985776 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.987494 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:05:52 crc kubenswrapper[5118]: I0223 07:05:52.993539 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts" (OuterVolumeSpecName: "scripts") pod "81af6409-f8ce-485c-a2a1-1b1cce7c5433" (UID: "81af6409-f8ce-485c-a2a1-1b1cce7c5433"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.008225 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f94b-account-create-update-dxxsj"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.009799 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.013065 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.019366 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bk5mz"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.020738 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.022998 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.024256 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.024304 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.024329 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-86zb7" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.031529 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bk5mz"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t6zq\" (UniqueName: \"kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040683 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdc4j\" (UniqueName: \"kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040759 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw748\" (UniqueName: \"kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748\") pod 
\"neutron-db-create-2fp2g\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040786 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040836 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040871 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts\") pod \"neutron-db-create-2fp2g\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040955 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81af6409-f8ce-485c-a2a1-1b1cce7c5433-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040973 5118 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040983 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.040995 5118 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/81af6409-f8ce-485c-a2a1-1b1cce7c5433-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.041005 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7t5h\" (UniqueName: \"kubernetes.io/projected/81af6409-f8ce-485c-a2a1-1b1cce7c5433-kube-api-access-s7t5h\") on node \"crc\" DevicePath \"\"" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.042289 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.042859 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.043440 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f94b-account-create-update-dxxsj"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.058621 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t6zq\" (UniqueName: \"kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq\") pod \"barbican-db-create-wvskp\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " 
pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.063494 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdc4j\" (UniqueName: \"kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j\") pod \"barbican-a094-account-create-update-htzlh\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.112014 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvskp" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.119409 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.142664 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5m75\" (UniqueName: \"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.142710 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.142751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw748\" (UniqueName: \"kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748\") pod \"neutron-db-create-2fp2g\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " 
pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.143080 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.143265 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.143347 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts\") pod \"neutron-db-create-2fp2g\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.143422 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4859\" (UniqueName: \"kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.143905 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts\") pod \"neutron-db-create-2fp2g\" (UID: 
\"71bdcb4a-de40-40a6-bba6-8010544c618f\") " pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.160573 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw748\" (UniqueName: \"kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748\") pod \"neutron-db-create-2fp2g\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.249951 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5m75\" (UniqueName: \"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.249995 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.250076 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.250116 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " 
pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.250143 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4859\" (UniqueName: \"kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.252826 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.259145 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.259782 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.269243 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5m75\" (UniqueName: \"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75\") pod \"keystone-db-sync-bk5mz\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") " pod="openstack/keystone-db-sync-bk5mz" Feb 23 
07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.277362 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4859\" (UniqueName: \"kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859\") pod \"neutron-f94b-account-create-update-dxxsj\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.297605 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2fp2g" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.330615 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.347778 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7lndk" event={"ID":"81af6409-f8ce-485c-a2a1-1b1cce7c5433","Type":"ContainerDied","Data":"8ae886d3dad73cfa747014d2930aeb3c5a885320bbc58f8eb5a8d5c85d9d14a3"} Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.349522 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ae886d3dad73cfa747014d2930aeb3c5a885320bbc58f8eb5a8d5c85d9d14a3" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.351916 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7lndk" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.363766 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.641299 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8rdg-config-gdzbt"] Feb 23 07:05:53 crc kubenswrapper[5118]: W0223 07:05:53.735606 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4682954_c391_4b8e_8f31_0d522d321b21.slice/crio-96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2 WatchSource:0}: Error finding container 96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2: Status 404 returned error can't find the container with id 96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2 Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.809493 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fst8q"] Feb 23 07:05:53 crc kubenswrapper[5118]: W0223 07:05:53.829363 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3af5d4fa_f037_4eb9_a893_7506a4541eb5.slice/crio-445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d WatchSource:0}: Error finding container 445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d: Status 404 returned error can't find the container with id 445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.920578 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a094-account-create-update-htzlh"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.938769 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wgxlm"] Feb 23 07:05:53 crc kubenswrapper[5118]: I0223 07:05:53.956166 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wvskp"] Feb 23 07:05:53 crc 
kubenswrapper[5118]: W0223 07:05:53.964814 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b574a0c_8f8a_4db9_b5bd_cfe8ea5fd488.slice/crio-e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4 WatchSource:0}: Error finding container e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4: Status 404 returned error can't find the container with id e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4 Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.222646 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2fp2g"] Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.316457 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bk5mz"] Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.412401 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f94b-account-create-update-dxxsj"] Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.418336 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fst8q" event={"ID":"3af5d4fa-f037-4eb9-a893-7506a4541eb5","Type":"ContainerStarted","Data":"fd21cfd0580525c1b7844427525a7e2066f46567f71940f55223ebed1457c9b1"} Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.418397 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fst8q" event={"ID":"3af5d4fa-f037-4eb9-a893-7506a4541eb5","Type":"ContainerStarted","Data":"445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d"} Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.424138 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wgxlm" event={"ID":"262a3252-99e5-4a2e-8164-c29c6e9b7764","Type":"ContainerStarted","Data":"66d827275b15ff20a3cfbc3dd393cb5035b993b3aeed3198f3f409305773544b"} Feb 23 07:05:54 crc 
kubenswrapper[5118]: I0223 07:05:54.425127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a094-account-create-update-htzlh" event={"ID":"50ec3af7-316f-43bc-9a57-cee6a641b441","Type":"ContainerStarted","Data":"47d609891fac2e2d9a632b341df1f0a0498780bb7beccf74e917a9a91f693252"} Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.439403 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2fp2g" event={"ID":"71bdcb4a-de40-40a6-bba6-8010544c618f","Type":"ContainerStarted","Data":"bb0af51d6d62146c6780ecbb5876094089fc5e7f35e35da9317e0f0c159cbe63"} Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.440239 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6db8-account-create-update-c2rqc"] Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.449052 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-fst8q" podStartSLOduration=5.449032298 podStartE2EDuration="5.449032298s" podCreationTimestamp="2026-02-23 07:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:05:54.448229959 +0000 UTC m=+1217.452014532" watchObservedRunningTime="2026-02-23 07:05:54.449032298 +0000 UTC m=+1217.452816871" Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.460818 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvskp" event={"ID":"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488","Type":"ContainerStarted","Data":"e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4"} Feb 23 07:05:54 crc kubenswrapper[5118]: I0223 07:05:54.463545 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg-config-gdzbt" event={"ID":"c4682954-c391-4b8e-8f31-0d522d321b21","Type":"ContainerStarted","Data":"96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2"} 
Feb 23 07:05:54 crc kubenswrapper[5118]: W0223 07:05:54.528920 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f945343_e53c_4dc0_a8e9_4165dd32b8b8.slice/crio-940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0 WatchSource:0}: Error finding container 940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0: Status 404 returned error can't find the container with id 940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.474023 5118 generic.go:334] "Generic (PLEG): container finished" podID="71bdcb4a-de40-40a6-bba6-8010544c618f" containerID="ef137dc1dae09884fa1f818bc1a3acbbc6d8e89520fb8a24cc76bcacf3b1d973" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.474195 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2fp2g" event={"ID":"71bdcb4a-de40-40a6-bba6-8010544c618f","Type":"ContainerDied","Data":"ef137dc1dae09884fa1f818bc1a3acbbc6d8e89520fb8a24cc76bcacf3b1d973"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.486143 5118 generic.go:334] "Generic (PLEG): container finished" podID="6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" containerID="402a757ca854a9d41f888fbe557b2790ea14bfab0b9b1beebf4734563316e710" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.486245 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvskp" event={"ID":"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488","Type":"ContainerDied","Data":"402a757ca854a9d41f888fbe557b2790ea14bfab0b9b1beebf4734563316e710"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.489662 5118 generic.go:334] "Generic (PLEG): container finished" podID="4acffeac-bca3-441a-bf4c-81033e75dd62" containerID="08f5ae809a52605089fc9cdb111ba3bc2e8bb6a1dc22b8e8172f0f7b7b1321e9" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.489710 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6db8-account-create-update-c2rqc" event={"ID":"4acffeac-bca3-441a-bf4c-81033e75dd62","Type":"ContainerDied","Data":"08f5ae809a52605089fc9cdb111ba3bc2e8bb6a1dc22b8e8172f0f7b7b1321e9"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.489727 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6db8-account-create-update-c2rqc" event={"ID":"4acffeac-bca3-441a-bf4c-81033e75dd62","Type":"ContainerStarted","Data":"afc28299aaf840f7aac8832c706dfa1efb4aadfd7746eb1e26d21b2c67ac3c7f"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.499581 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bk5mz" event={"ID":"5f945343-e53c-4dc0-a8e9-4165dd32b8b8","Type":"ContainerStarted","Data":"940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.502415 5118 generic.go:334] "Generic (PLEG): container finished" podID="262a3252-99e5-4a2e-8164-c29c6e9b7764" containerID="53c815e6e0d97274de16eccba1c763fdb5994fa4f5076151e86ee65fd19101e1" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.502503 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wgxlm" event={"ID":"262a3252-99e5-4a2e-8164-c29c6e9b7764","Type":"ContainerDied","Data":"53c815e6e0d97274de16eccba1c763fdb5994fa4f5076151e86ee65fd19101e1"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.506725 5118 generic.go:334] "Generic (PLEG): container finished" podID="3af5d4fa-f037-4eb9-a893-7506a4541eb5" containerID="fd21cfd0580525c1b7844427525a7e2066f46567f71940f55223ebed1457c9b1" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.506796 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fst8q" 
event={"ID":"3af5d4fa-f037-4eb9-a893-7506a4541eb5","Type":"ContainerDied","Data":"fd21cfd0580525c1b7844427525a7e2066f46567f71940f55223ebed1457c9b1"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.509606 5118 generic.go:334] "Generic (PLEG): container finished" podID="50ec3af7-316f-43bc-9a57-cee6a641b441" containerID="bf753c5ff204f7671f6f916f3471b964c3b8e931842477bec2466966122fb249" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.509675 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a094-account-create-update-htzlh" event={"ID":"50ec3af7-316f-43bc-9a57-cee6a641b441","Type":"ContainerDied","Data":"bf753c5ff204f7671f6f916f3471b964c3b8e931842477bec2466966122fb249"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.517910 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ms95n" event={"ID":"222c1eb0-e9da-4365-ad64-850496d1ceb7","Type":"ContainerStarted","Data":"f91de33e332b8b4a6dbf6738e42a54c52bd94c43aa50a9f0ff04ca4baa4554bd"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.522974 5118 generic.go:334] "Generic (PLEG): container finished" podID="c4682954-c391-4b8e-8f31-0d522d321b21" containerID="7cb8242dbfa61e0770e846acb68fe689bced9a76568c54dd9f46da08dc2ffd0a" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.523208 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg-config-gdzbt" event={"ID":"c4682954-c391-4b8e-8f31-0d522d321b21","Type":"ContainerDied","Data":"7cb8242dbfa61e0770e846acb68fe689bced9a76568c54dd9f46da08dc2ffd0a"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.540991 5118 generic.go:334] "Generic (PLEG): container finished" podID="deca025a-7a57-4244-84e3-541fd5f7760d" containerID="c97640bf824262195b04827fc1d9a2c28b53219df60d5c4d4ae2cebf410083aa" exitCode=0 Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.541071 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-f94b-account-create-update-dxxsj" event={"ID":"deca025a-7a57-4244-84e3-541fd5f7760d","Type":"ContainerDied","Data":"c97640bf824262195b04827fc1d9a2c28b53219df60d5c4d4ae2cebf410083aa"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.541129 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f94b-account-create-update-dxxsj" event={"ID":"deca025a-7a57-4244-84e3-541fd5f7760d","Type":"ContainerStarted","Data":"d02ae2550b3a278ca1e78f53727110779d3d59f9735828b833970664d74c1c91"} Feb 23 07:05:55 crc kubenswrapper[5118]: I0223 07:05:55.671969 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ms95n" podStartSLOduration=5.419497949 podStartE2EDuration="19.671949931s" podCreationTimestamp="2026-02-23 07:05:36 +0000 UTC" firstStartedPulling="2026-02-23 07:05:38.935986065 +0000 UTC m=+1201.939770638" lastFinishedPulling="2026-02-23 07:05:53.188438047 +0000 UTC m=+1216.192222620" observedRunningTime="2026-02-23 07:05:55.661009443 +0000 UTC m=+1218.664794006" watchObservedRunningTime="2026-02-23 07:05:55.671949931 +0000 UTC m=+1218.675734504" Feb 23 07:05:56 crc kubenswrapper[5118]: I0223 07:05:56.742338 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-f8rdg" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.162000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " pod="openstack/swift-storage-0" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.176297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"swift-storage-0\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") " 
pod="openstack/swift-storage-0" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.402403 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.610484 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f94b-account-create-update-dxxsj" event={"ID":"deca025a-7a57-4244-84e3-541fd5f7760d","Type":"ContainerDied","Data":"d02ae2550b3a278ca1e78f53727110779d3d59f9735828b833970664d74c1c91"} Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.610707 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d02ae2550b3a278ca1e78f53727110779d3d59f9735828b833970664d74c1c91" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.618520 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a094-account-create-update-htzlh" event={"ID":"50ec3af7-316f-43bc-9a57-cee6a641b441","Type":"ContainerDied","Data":"47d609891fac2e2d9a632b341df1f0a0498780bb7beccf74e917a9a91f693252"} Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.618583 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47d609891fac2e2d9a632b341df1f0a0498780bb7beccf74e917a9a91f693252" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.622270 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2fp2g" event={"ID":"71bdcb4a-de40-40a6-bba6-8010544c618f","Type":"ContainerDied","Data":"bb0af51d6d62146c6780ecbb5876094089fc5e7f35e35da9317e0f0c159cbe63"} Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.622324 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb0af51d6d62146c6780ecbb5876094089fc5e7f35e35da9317e0f0c159cbe63" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.624602 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6db8-account-create-update-c2rqc" 
event={"ID":"4acffeac-bca3-441a-bf4c-81033e75dd62","Type":"ContainerDied","Data":"afc28299aaf840f7aac8832c706dfa1efb4aadfd7746eb1e26d21b2c67ac3c7f"} Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.624642 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc28299aaf840f7aac8832c706dfa1efb4aadfd7746eb1e26d21b2c67ac3c7f" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.774186 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-dxxsj" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.811897 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2fp2g" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.815554 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-htzlh" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.845589 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-c2rqc" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.858833 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvskp" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.859135 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg-config-gdzbt" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.866452 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fst8q" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.882005 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4859\" (UniqueName: \"kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859\") pod \"deca025a-7a57-4244-84e3-541fd5f7760d\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.882065 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts\") pod \"deca025a-7a57-4244-84e3-541fd5f7760d\" (UID: \"deca025a-7a57-4244-84e3-541fd5f7760d\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.883640 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "deca025a-7a57-4244-84e3-541fd5f7760d" (UID: "deca025a-7a57-4244-84e3-541fd5f7760d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.886999 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wgxlm" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.890703 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859" (OuterVolumeSpecName: "kube-api-access-r4859") pod "deca025a-7a57-4244-84e3-541fd5f7760d" (UID: "deca025a-7a57-4244-84e3-541fd5f7760d"). InnerVolumeSpecName "kube-api-access-r4859". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.983982 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984023 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984058 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr4mt\" (UniqueName: \"kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt\") pod \"262a3252-99e5-4a2e-8164-c29c6e9b7764\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984082 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts\") pod \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984122 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts\") pod \"262a3252-99e5-4a2e-8164-c29c6e9b7764\" (UID: \"262a3252-99e5-4a2e-8164-c29c6e9b7764\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984178 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw748\" 
(UniqueName: \"kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748\") pod \"71bdcb4a-de40-40a6-bba6-8010544c618f\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984209 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984243 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts\") pod \"4acffeac-bca3-441a-bf4c-81033e75dd62\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984278 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts\") pod \"50ec3af7-316f-43bc-9a57-cee6a641b441\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984302 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkght\" (UniqueName: \"kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght\") pod \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\" (UID: \"3af5d4fa-f037-4eb9-a893-7506a4541eb5\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984333 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts\") pod \"71bdcb4a-de40-40a6-bba6-8010544c618f\" (UID: \"71bdcb4a-de40-40a6-bba6-8010544c618f\") " 
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984359 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdc4j\" (UniqueName: \"kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j\") pod \"50ec3af7-316f-43bc-9a57-cee6a641b441\" (UID: \"50ec3af7-316f-43bc-9a57-cee6a641b441\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984377 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984394 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts\") pod \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984417 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t6zq\" (UniqueName: \"kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq\") pod \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\" (UID: \"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984458 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdvxm\" (UniqueName: \"kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm\") pod \"4acffeac-bca3-441a-bf4c-81033e75dd62\" (UID: \"4acffeac-bca3-441a-bf4c-81033e75dd62\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984511 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4s6m\" (UniqueName: 
\"kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984540 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run\") pod \"c4682954-c391-4b8e-8f31-0d522d321b21\" (UID: \"c4682954-c391-4b8e-8f31-0d522d321b21\") " Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984879 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4859\" (UniqueName: \"kubernetes.io/projected/deca025a-7a57-4244-84e3-541fd5f7760d-kube-api-access-r4859\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984890 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/deca025a-7a57-4244-84e3-541fd5f7760d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.984942 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run" (OuterVolumeSpecName: "var-run") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.985166 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4acffeac-bca3-441a-bf4c-81033e75dd62" (UID: "4acffeac-bca3-441a-bf4c-81033e75dd62"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.985394 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50ec3af7-316f-43bc-9a57-cee6a641b441" (UID: "50ec3af7-316f-43bc-9a57-cee6a641b441"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.986188 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3af5d4fa-f037-4eb9-a893-7506a4541eb5" (UID: "3af5d4fa-f037-4eb9-a893-7506a4541eb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.986214 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.987340 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts" (OuterVolumeSpecName: "scripts") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.987907 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "262a3252-99e5-4a2e-8164-c29c6e9b7764" (UID: "262a3252-99e5-4a2e-8164-c29c6e9b7764"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.989913 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.990014 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" (UID: "6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.990075 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.990521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71bdcb4a-de40-40a6-bba6-8010544c618f" (UID: "71bdcb4a-de40-40a6-bba6-8010544c618f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.992490 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght" (OuterVolumeSpecName: "kube-api-access-dkght") pod "3af5d4fa-f037-4eb9-a893-7506a4541eb5" (UID: "3af5d4fa-f037-4eb9-a893-7506a4541eb5"). InnerVolumeSpecName "kube-api-access-dkght". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.992523 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt" (OuterVolumeSpecName: "kube-api-access-qr4mt") pod "262a3252-99e5-4a2e-8164-c29c6e9b7764" (UID: "262a3252-99e5-4a2e-8164-c29c6e9b7764"). InnerVolumeSpecName "kube-api-access-qr4mt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.992738 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748" (OuterVolumeSpecName: "kube-api-access-dw748") pod "71bdcb4a-de40-40a6-bba6-8010544c618f" (UID: "71bdcb4a-de40-40a6-bba6-8010544c618f"). InnerVolumeSpecName "kube-api-access-dw748". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.993897 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m" (OuterVolumeSpecName: "kube-api-access-s4s6m") pod "c4682954-c391-4b8e-8f31-0d522d321b21" (UID: "c4682954-c391-4b8e-8f31-0d522d321b21"). InnerVolumeSpecName "kube-api-access-s4s6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.994334 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq" (OuterVolumeSpecName: "kube-api-access-6t6zq") pod "6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" (UID: "6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488"). InnerVolumeSpecName "kube-api-access-6t6zq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.994742 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j" (OuterVolumeSpecName: "kube-api-access-fdc4j") pod "50ec3af7-316f-43bc-9a57-cee6a641b441" (UID: "50ec3af7-316f-43bc-9a57-cee6a641b441"). InnerVolumeSpecName "kube-api-access-fdc4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:01 crc kubenswrapper[5118]: I0223 07:06:01.995967 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm" (OuterVolumeSpecName: "kube-api-access-pdvxm") pod "4acffeac-bca3-441a-bf4c-81033e75dd62" (UID: "4acffeac-bca3-441a-bf4c-81033e75dd62"). InnerVolumeSpecName "kube-api-access-pdvxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086527 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdc4j\" (UniqueName: \"kubernetes.io/projected/50ec3af7-316f-43bc-9a57-cee6a641b441-kube-api-access-fdc4j\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086561 5118 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086580 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086589 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t6zq\" (UniqueName: \"kubernetes.io/projected/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488-kube-api-access-6t6zq\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086598 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdvxm\" (UniqueName: \"kubernetes.io/projected/4acffeac-bca3-441a-bf4c-81033e75dd62-kube-api-access-pdvxm\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086612 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4s6m\" (UniqueName: \"kubernetes.io/projected/c4682954-c391-4b8e-8f31-0d522d321b21-kube-api-access-s4s6m\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086634 5118 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-run\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086645 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086652 5118 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c4682954-c391-4b8e-8f31-0d522d321b21-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086660 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr4mt\" (UniqueName: \"kubernetes.io/projected/262a3252-99e5-4a2e-8164-c29c6e9b7764-kube-api-access-qr4mt\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086668 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af5d4fa-f037-4eb9-a893-7506a4541eb5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086676 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/262a3252-99e5-4a2e-8164-c29c6e9b7764-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086684 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw748\" (UniqueName: \"kubernetes.io/projected/71bdcb4a-de40-40a6-bba6-8010544c618f-kube-api-access-dw748\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086691 5118 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c4682954-c391-4b8e-8f31-0d522d321b21-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086700 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acffeac-bca3-441a-bf4c-81033e75dd62-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086708 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50ec3af7-316f-43bc-9a57-cee6a641b441-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086716 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkght\" (UniqueName: \"kubernetes.io/projected/3af5d4fa-f037-4eb9-a893-7506a4541eb5-kube-api-access-dkght\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.086724 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71bdcb4a-de40-40a6-bba6-8010544c618f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.278777 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.636741 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wvskp" event={"ID":"6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488","Type":"ContainerDied","Data":"e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.636825 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31ff6571175167a1b5120c31b7bfc0e610376dc83d59a26c4a878f8efe955f4"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.636769 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wvskp"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.638537 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg-config-gdzbt" event={"ID":"c4682954-c391-4b8e-8f31-0d522d321b21","Type":"ContainerDied","Data":"96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.638586 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96197bdbd3aae2abc734b3ca9030462f9ad022367e61848fa70ef890c42231a2"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.638652 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8rdg-config-gdzbt"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.647924 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"5a9bb9aa7eb6e2635b6815a29d567a3e06c60854cd41b883f534e669bdc8ace2"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.655353 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bk5mz" event={"ID":"5f945343-e53c-4dc0-a8e9-4165dd32b8b8","Type":"ContainerStarted","Data":"1de6314811ebb64ce7d6e81e27c3ebe9806a6a2be25b7adcba9d4d318f00ff32"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.659530 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wgxlm" event={"ID":"262a3252-99e5-4a2e-8164-c29c6e9b7764","Type":"ContainerDied","Data":"66d827275b15ff20a3cfbc3dd393cb5035b993b3aeed3198f3f409305773544b"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.659576 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d827275b15ff20a3cfbc3dd393cb5035b993b3aeed3198f3f409305773544b"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.659679 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wgxlm"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.661607 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-dxxsj"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.662396 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-htzlh"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.670829 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fst8q"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.672392 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-c2rqc"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.672519 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2fp2g"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.672713 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fst8q" event={"ID":"3af5d4fa-f037-4eb9-a893-7506a4541eb5","Type":"ContainerDied","Data":"445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d"}
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.672887 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="445e6c1dfd76d7c8e5d11a601b358238892d3b8a91497e3e06ff21dbd9acad0d"
Feb 23 07:06:02 crc kubenswrapper[5118]: I0223 07:06:02.700718 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bk5mz" podStartSLOduration=3.498581706 podStartE2EDuration="10.700693605s" podCreationTimestamp="2026-02-23 07:05:52 +0000 UTC" firstStartedPulling="2026-02-23 07:05:54.531233906 +0000 UTC m=+1217.535018479" lastFinishedPulling="2026-02-23 07:06:01.733345795 +0000 UTC m=+1224.737130378" observedRunningTime="2026-02-23 07:06:02.69580059 +0000 UTC m=+1225.699585173" watchObservedRunningTime="2026-02-23 07:06:02.700693605 +0000 UTC m=+1225.704478198"
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.023075 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f8rdg-config-gdzbt"]
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.029886 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-f8rdg-config-gdzbt"]
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.674774 5118 generic.go:334] "Generic (PLEG): container finished" podID="222c1eb0-e9da-4365-ad64-850496d1ceb7" containerID="f91de33e332b8b4a6dbf6738e42a54c52bd94c43aa50a9f0ff04ca4baa4554bd" exitCode=0
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.674887 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ms95n" event={"ID":"222c1eb0-e9da-4365-ad64-850496d1ceb7","Type":"ContainerDied","Data":"f91de33e332b8b4a6dbf6738e42a54c52bd94c43aa50a9f0ff04ca4baa4554bd"}
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.677386 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56"}
Feb 23 07:06:03 crc kubenswrapper[5118]: I0223 07:06:03.715575 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4682954-c391-4b8e-8f31-0d522d321b21" path="/var/lib/kubelet/pods/c4682954-c391-4b8e-8f31-0d522d321b21/volumes"
Feb 23 07:06:04 crc kubenswrapper[5118]: I0223 07:06:04.690543 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a"}
Feb 23 07:06:04 crc kubenswrapper[5118]: I0223 07:06:04.690907 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865"}
Feb 23 07:06:04 crc kubenswrapper[5118]: I0223 07:06:04.690921 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b"}
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.107779 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ms95n"
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.249154 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle\") pod \"222c1eb0-e9da-4365-ad64-850496d1ceb7\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") "
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.249480 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzps2\" (UniqueName: \"kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2\") pod \"222c1eb0-e9da-4365-ad64-850496d1ceb7\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") "
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.249535 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data\") pod \"222c1eb0-e9da-4365-ad64-850496d1ceb7\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") "
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.249575 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data\") pod \"222c1eb0-e9da-4365-ad64-850496d1ceb7\" (UID: \"222c1eb0-e9da-4365-ad64-850496d1ceb7\") "
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.256483 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2" (OuterVolumeSpecName: "kube-api-access-xzps2") pod "222c1eb0-e9da-4365-ad64-850496d1ceb7" (UID: "222c1eb0-e9da-4365-ad64-850496d1ceb7"). InnerVolumeSpecName "kube-api-access-xzps2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.258773 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "222c1eb0-e9da-4365-ad64-850496d1ceb7" (UID: "222c1eb0-e9da-4365-ad64-850496d1ceb7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.276432 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "222c1eb0-e9da-4365-ad64-850496d1ceb7" (UID: "222c1eb0-e9da-4365-ad64-850496d1ceb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.314689 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data" (OuterVolumeSpecName: "config-data") pod "222c1eb0-e9da-4365-ad64-850496d1ceb7" (UID: "222c1eb0-e9da-4365-ad64-850496d1ceb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.352214 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.352254 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.352265 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzps2\" (UniqueName: \"kubernetes.io/projected/222c1eb0-e9da-4365-ad64-850496d1ceb7-kube-api-access-xzps2\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.352275 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c1eb0-e9da-4365-ad64-850496d1ceb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.703236 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ms95n"
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.716617 5118 generic.go:334] "Generic (PLEG): container finished" podID="5f945343-e53c-4dc0-a8e9-4165dd32b8b8" containerID="1de6314811ebb64ce7d6e81e27c3ebe9806a6a2be25b7adcba9d4d318f00ff32" exitCode=0
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.765599 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ms95n" event={"ID":"222c1eb0-e9da-4365-ad64-850496d1ceb7","Type":"ContainerDied","Data":"447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a"}
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.766217 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447092a4d8d04432a4890c4c9b676d06a6ea61ddebae9ea5050e374ea476347a"
Feb 23 07:06:05 crc kubenswrapper[5118]: I0223 07:06:05.766444 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bk5mz" event={"ID":"5f945343-e53c-4dc0-a8e9-4165dd32b8b8","Type":"ContainerDied","Data":"1de6314811ebb64ce7d6e81e27c3ebe9806a6a2be25b7adcba9d4d318f00ff32"}
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.206886 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"]
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207414 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207441 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207464 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4682954-c391-4b8e-8f31-0d522d321b21" containerName="ovn-config"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207473 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4682954-c391-4b8e-8f31-0d522d321b21" containerName="ovn-config"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207487 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="262a3252-99e5-4a2e-8164-c29c6e9b7764" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207495 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="262a3252-99e5-4a2e-8164-c29c6e9b7764" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207509 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af5d4fa-f037-4eb9-a893-7506a4541eb5" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207518 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af5d4fa-f037-4eb9-a893-7506a4541eb5" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207532 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ec3af7-316f-43bc-9a57-cee6a641b441" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207539 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ec3af7-316f-43bc-9a57-cee6a641b441" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207555 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222c1eb0-e9da-4365-ad64-850496d1ceb7" containerName="glance-db-sync"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207562 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="222c1eb0-e9da-4365-ad64-850496d1ceb7" containerName="glance-db-sync"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207572 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deca025a-7a57-4244-84e3-541fd5f7760d" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207580 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="deca025a-7a57-4244-84e3-541fd5f7760d" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207605 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bdcb4a-de40-40a6-bba6-8010544c618f" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207613 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bdcb4a-de40-40a6-bba6-8010544c618f" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: E0223 07:06:06.207626 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acffeac-bca3-441a-bf4c-81033e75dd62" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207635 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acffeac-bca3-441a-bf4c-81033e75dd62" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207823 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ec3af7-316f-43bc-9a57-cee6a641b441" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207841 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af5d4fa-f037-4eb9-a893-7506a4541eb5" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207852 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acffeac-bca3-441a-bf4c-81033e75dd62" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207863 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207872 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bdcb4a-de40-40a6-bba6-8010544c618f" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207892 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4682954-c391-4b8e-8f31-0d522d321b21" containerName="ovn-config"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207904 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="222c1eb0-e9da-4365-ad64-850496d1ceb7" containerName="glance-db-sync"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207912 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="deca025a-7a57-4244-84e3-541fd5f7760d" containerName="mariadb-account-create-update"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.207924 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="262a3252-99e5-4a2e-8164-c29c6e9b7764" containerName="mariadb-database-create"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.214815 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.236565 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"]
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.391505 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv5tr\" (UniqueName: \"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.391608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.391653 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.391684 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.391707 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.494316 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv5tr\" (UniqueName: \"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.494414 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.494461 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.494482 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.494503 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.495433 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.496369 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.496501 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.496649 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.529052 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv5tr\" (UniqueName: \"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr\") pod \"dnsmasq-dns-74cc88677c-zltzg\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.550695 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cc88677c-zltzg"
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.776203 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f"}
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.776530 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553"}
Feb 23 07:06:06 crc kubenswrapper[5118]: I0223 07:06:06.776542 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed"}
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.064590 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bk5mz"
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.068552 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"]
Feb 23 07:06:07 crc kubenswrapper[5118]: W0223 07:06:07.072164 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3304d1b4_e4de_404f_ace5_2417aa3d766b.slice/crio-7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4 WatchSource:0}: Error finding container 7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4: Status 404 returned error can't find the container with id 7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.210646 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle\") pod \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") "
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.210869 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5m75\" (UniqueName: \"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75\") pod \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") "
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.210991 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data\") pod \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\" (UID: \"5f945343-e53c-4dc0-a8e9-4165dd32b8b8\") "
Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.215042 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75" (OuterVolumeSpecName: "kube-api-access-s5m75") pod "5f945343-e53c-4dc0-a8e9-4165dd32b8b8" (UID: "5f945343-e53c-4dc0-a8e9-4165dd32b8b8"). InnerVolumeSpecName "kube-api-access-s5m75". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.245089 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f945343-e53c-4dc0-a8e9-4165dd32b8b8" (UID: "5f945343-e53c-4dc0-a8e9-4165dd32b8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.258698 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data" (OuterVolumeSpecName: "config-data") pod "5f945343-e53c-4dc0-a8e9-4165dd32b8b8" (UID: "5f945343-e53c-4dc0-a8e9-4165dd32b8b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.314729 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.314767 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.314781 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5m75\" (UniqueName: \"kubernetes.io/projected/5f945343-e53c-4dc0-a8e9-4165dd32b8b8-kube-api-access-s5m75\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.793190 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9"} Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.795954 5118 generic.go:334] "Generic (PLEG): container finished" podID="3304d1b4-e4de-404f-ace5-2417aa3d766b" containerID="36227bd2ee2474a1b4cee3744c1b31993590cab58dba5cade7b3f1d0c88df3c1" exitCode=0 Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.796007 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cc88677c-zltzg" event={"ID":"3304d1b4-e4de-404f-ace5-2417aa3d766b","Type":"ContainerDied","Data":"36227bd2ee2474a1b4cee3744c1b31993590cab58dba5cade7b3f1d0c88df3c1"} Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.796025 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cc88677c-zltzg" 
event={"ID":"3304d1b4-e4de-404f-ace5-2417aa3d766b","Type":"ContainerStarted","Data":"7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4"} Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.799541 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bk5mz" event={"ID":"5f945343-e53c-4dc0-a8e9-4165dd32b8b8","Type":"ContainerDied","Data":"940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0"} Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.799820 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940e6d8e7d78ee465418a0ee9c5ccff182b25f7ea8455b8b1e1507e7435437a0" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.799566 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bk5mz" Feb 23 07:06:07 crc kubenswrapper[5118]: I0223 07:06:07.997283 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.021678 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dvxfg"] Feb 23 07:06:08 crc kubenswrapper[5118]: E0223 07:06:08.022106 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f945343-e53c-4dc0-a8e9-4165dd32b8b8" containerName="keystone-db-sync" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.022141 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f945343-e53c-4dc0-a8e9-4165dd32b8b8" containerName="keystone-db-sync" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.022344 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f945343-e53c-4dc0-a8e9-4165dd32b8b8" containerName="keystone-db-sync" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.022957 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.028241 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.031129 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.031531 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-86zb7" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.039274 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.039344 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.046583 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dvxfg"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.085306 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.088650 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.110470 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133503 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133614 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133692 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133772 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.133869 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mlls\" (UniqueName: \"kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.238928 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239004 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239044 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239063 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239083 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239175 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239211 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239244 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239278 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb\") pod 
\"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239315 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvcgs\" (UniqueName: \"kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.239332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mlls\" (UniqueName: \"kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.248662 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fblpx"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.252679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.253223 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.253443 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.254707 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.254900 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.257793 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s4k5s" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.258117 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.258450 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.260286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.277013 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.277583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mlls\" (UniqueName: 
\"kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls\") pod \"keystone-bootstrap-dvxfg\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") " pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.279372 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.281477 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.284927 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.297264 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fblpx"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.317404 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343300 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343357 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvcgs\" (UniqueName: \"kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343386 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343404 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343430 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28lfg\" (UniqueName: \"kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343472 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343497 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343527 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343592 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.343637 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.344461 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.345327 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb\") pod 
\"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.349782 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.350437 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.385035 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvcgs\" (UniqueName: \"kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs\") pod \"dnsmasq-dns-567dccfc7f-k576w\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.407341 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.425259 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-95g6h"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.426603 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.428442 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.429963 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-krcbq" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.430230 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.430340 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445773 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvtl\" (UniqueName: \"kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445831 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445883 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445915 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle\") pod \"cinder-db-sync-fblpx\" (UID: 
\"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445939 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445957 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.445984 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28lfg\" (UniqueName: \"kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446001 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446040 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446064 
5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446079 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446129 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446159 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.446276 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.452866 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.468710 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.482123 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.482440 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tv452"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.483649 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.498620 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.498917 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.499028 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nn6zr" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.513953 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tv452"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.524746 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28lfg\" (UniqueName: \"kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg\") pod \"cinder-db-sync-fblpx\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547390 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547455 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " 
pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547487 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvtl\" (UniqueName: \"kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547508 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547569 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547600 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547641 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfr4\" (UniqueName: \"kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547677 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547710 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.547744 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.550321 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-95g6h"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.550657 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.551158 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.552340 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.558918 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.569609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.571561 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.573905 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvtl\" (UniqueName: \"kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl\") pod \"ceilometer-0\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.594352 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-lhskw"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.595857 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.598542 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.598649 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.606829 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gg6n8" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.620540 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lhskw"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.629095 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.632713 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651390 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651535 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651570 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8wf72\" (UniqueName: \"kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651602 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfr4\" (UniqueName: \"kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651691 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.651714 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.657686 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.660303 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.662510 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.671824 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.674335 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfr4\" (UniqueName: \"kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.676544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config\") pod \"neutron-db-sync-95g6h\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.682203 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"] Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.754667 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.755572 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.755637 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.755677 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.755713 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzj8c\" (UniqueName: \"kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.755774 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " 
pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wf72\" (UniqueName: \"kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757461 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757596 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757623 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757692 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " 
pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757758 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757807 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcs5b\" (UniqueName: \"kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.757862 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.762212 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.767042 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 
07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.773098 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wf72\" (UniqueName: \"kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72\") pod \"barbican-db-sync-tv452\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.860398 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcs5b\" (UniqueName: \"kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.861360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.861770 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.861976 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzj8c\" (UniqueName: \"kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862140 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862313 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862492 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862544 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862610 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.862671 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.863286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.864371 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.864395 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.864558 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.864954 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: 
\"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.869774 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.874642 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.881955 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.885541 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcs5b\" (UniqueName: \"kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b\") pod \"dnsmasq-dns-649bbcf5cc-5vgdl\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") " pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:08 crc kubenswrapper[5118]: I0223 07:06:08.886510 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzj8c\" (UniqueName: \"kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c\") pod \"placement-db-sync-lhskw\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") " pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:09 crc 
kubenswrapper[5118]: I0223 07:06:09.001158 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.026975 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lhskw" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.037857 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.154244 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.155780 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.157887 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zf7tv" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.158983 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.160031 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.172437 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:09 crc kubenswrapper[5118]: E0223 07:06:09.267683 5118 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 23 07:06:09 crc kubenswrapper[5118]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/3304d1b4-e4de-404f-ace5-2417aa3d766b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:06:09 crc kubenswrapper[5118]: > 
podSandboxID="7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4" Feb 23 07:06:09 crc kubenswrapper[5118]: E0223 07:06:09.268313 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:06:09 crc kubenswrapper[5118]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59fh58bhfh5f8h568hb8h79h5b6h544h588h575h665h5dbhf4hfdh6h66dh5h5dch695h655h569h5bchf4hd7h684hd9hdch57h549h54ch586q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wv5tr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation
:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-74cc88677c-zltzg_openstack(3304d1b4-e4de-404f-ace5-2417aa3d766b): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3304d1b4-e4de-404f-ace5-2417aa3d766b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:06:09 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 07:06:09 crc kubenswrapper[5118]: E0223 07:06:09.270020 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3304d1b4-e4de-404f-ace5-2417aa3d766b/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" 
pod="openstack/dnsmasq-dns-74cc88677c-zltzg" podUID="3304d1b4-e4de-404f-ace5-2417aa3d766b" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.271746 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.271899 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.271970 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.272053 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcs7t\" (UniqueName: \"kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.272169 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.272422 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.272610 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.283639 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.286500 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.292707 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.293502 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374222 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374542 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374589 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374613 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 
07:06:09.374649 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374681 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcs7t\" (UniqueName: \"kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374705 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6hr\" (UniqueName: \"kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374731 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374755 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc 
kubenswrapper[5118]: I0223 07:06:09.374838 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374879 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374954 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.374975 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.375012 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.375055 5118 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.375557 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.375670 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.379993 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.380664 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.380672 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.392127 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcs7t\" (UniqueName: \"kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.411828 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.476802 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.476867 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6hr\" (UniqueName: \"kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.476895 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.476976 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.477061 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.477083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.477157 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.478568 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"d2f8b7be-c950-4cf1-82c2-4edec370501c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.478817 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.478892 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.482416 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.484067 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.484890 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.493544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.498345 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6hr\" (UniqueName: \"kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.511977 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.609034 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:09 crc kubenswrapper[5118]: I0223 07:06:09.957561 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fblpx"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.153047 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.360431 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74cc88677c-zltzg" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.401944 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb\") pod \"3304d1b4-e4de-404f-ace5-2417aa3d766b\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.401997 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb\") pod \"3304d1b4-e4de-404f-ace5-2417aa3d766b\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.402052 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv5tr\" (UniqueName: \"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr\") pod \"3304d1b4-e4de-404f-ace5-2417aa3d766b\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.402094 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc\") pod \"3304d1b4-e4de-404f-ace5-2417aa3d766b\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.408393 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config\") pod \"3304d1b4-e4de-404f-ace5-2417aa3d766b\" (UID: \"3304d1b4-e4de-404f-ace5-2417aa3d766b\") " Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.411414 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr" (OuterVolumeSpecName: "kube-api-access-wv5tr") pod "3304d1b4-e4de-404f-ace5-2417aa3d766b" (UID: "3304d1b4-e4de-404f-ace5-2417aa3d766b"). InnerVolumeSpecName "kube-api-access-wv5tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.521413 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv5tr\" (UniqueName: \"kubernetes.io/projected/3304d1b4-e4de-404f-ace5-2417aa3d766b-kube-api-access-wv5tr\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.522710 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3304d1b4-e4de-404f-ace5-2417aa3d766b" (UID: "3304d1b4-e4de-404f-ace5-2417aa3d766b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.526934 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config" (OuterVolumeSpecName: "config") pod "3304d1b4-e4de-404f-ace5-2417aa3d766b" (UID: "3304d1b4-e4de-404f-ace5-2417aa3d766b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.532837 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3304d1b4-e4de-404f-ace5-2417aa3d766b" (UID: "3304d1b4-e4de-404f-ace5-2417aa3d766b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.537718 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3304d1b4-e4de-404f-ace5-2417aa3d766b" (UID: "3304d1b4-e4de-404f-ace5-2417aa3d766b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.560938 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.574914 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lhskw"] Feb 23 07:06:10 crc kubenswrapper[5118]: W0223 07:06:10.607678 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3867ac0b_7e84_483e_917e_3f08a8ee2ae0.slice/crio-fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5 WatchSource:0}: Error finding container fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5: Status 404 returned error can't find the container with id fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5 Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.615400 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-95g6h"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.625561 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.625599 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.625608 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.625757 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3304d1b4-e4de-404f-ace5-2417aa3d766b-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.654486 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.708461 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dvxfg"] Feb 23 07:06:10 crc kubenswrapper[5118]: I0223 07:06:10.762174 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.010779 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-95g6h" event={"ID":"3867ac0b-7e84-483e-917e-3f08a8ee2ae0","Type":"ContainerStarted","Data":"fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.032271 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.104665 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lhskw" event={"ID":"70884006-8eb9-4cbb-ac48-a18531a8fe62","Type":"ContainerStarted","Data":"fcb06f1d231ae10929513a90b7cf4d242fb1cde0d9a7e1426fbe77ff163c2b93"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.150555 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.151974 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dvxfg" event={"ID":"6d0b5007-ea48-438d-a216-e78519ab4d0d","Type":"ContainerStarted","Data":"3d7059b37a54f5b7f21ed4b5a0af749e6c8e1ae67192e90c25af50d8fedd112a"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.160187 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tv452"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.167938 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.176016 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.199260 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.199306 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.214574 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cc88677c-zltzg" event={"ID":"3304d1b4-e4de-404f-ace5-2417aa3d766b","Type":"ContainerDied","Data":"7063916477eaea519a3984d827d633f5734348e872595945567ae0aeb54356c4"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.214667 5118 scope.go:117] "RemoveContainer" containerID="36227bd2ee2474a1b4cee3744c1b31993590cab58dba5cade7b3f1d0c88df3c1" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.214916 5118 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cc88677c-zltzg" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.227460 5118 generic.go:334] "Generic (PLEG): container finished" podID="8347e765-6b8c-4da9-93d7-2d127a7f9389" containerID="a4cbe1c5d6d2c83e36e12661477d1819dfae8c8f0c8bc202dc676babc253ff91" exitCode=0 Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.227732 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" event={"ID":"8347e765-6b8c-4da9-93d7-2d127a7f9389","Type":"ContainerDied","Data":"a4cbe1c5d6d2c83e36e12661477d1819dfae8c8f0c8bc202dc676babc253ff91"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.227766 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" event={"ID":"8347e765-6b8c-4da9-93d7-2d127a7f9389","Type":"ContainerStarted","Data":"6835e504ef1d5e5bba6edf7442f7239f37452865d8e4a99f2ca11aea3217b585"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.245382 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" event={"ID":"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b","Type":"ContainerStarted","Data":"8642957e6d426e68a0ae6c1c7341eef1ece2eb8d11ccdeacbf41eeabaa969403"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.249149 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fblpx" event={"ID":"2df7ebd2-0918-4164-beec-18057338255d","Type":"ContainerStarted","Data":"f17b3306d490c28d3c5c4745327401652eb09de2794354c7c7d46f11d0bc1c37"} Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.365709 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"] Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.381933 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74cc88677c-zltzg"] Feb 23 07:06:11 crc kubenswrapper[5118]: 
I0223 07:06:11.726308 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3304d1b4-e4de-404f-ace5-2417aa3d766b" path="/var/lib/kubelet/pods/3304d1b4-e4de-404f-ace5-2417aa3d766b/volumes" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.789254 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.874504 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb\") pod \"8347e765-6b8c-4da9-93d7-2d127a7f9389\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.874587 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc\") pod \"8347e765-6b8c-4da9-93d7-2d127a7f9389\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.874650 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config\") pod \"8347e765-6b8c-4da9-93d7-2d127a7f9389\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.874830 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvcgs\" (UniqueName: \"kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs\") pod \"8347e765-6b8c-4da9-93d7-2d127a7f9389\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.874937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb\") pod \"8347e765-6b8c-4da9-93d7-2d127a7f9389\" (UID: \"8347e765-6b8c-4da9-93d7-2d127a7f9389\") " Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.889902 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs" (OuterVolumeSpecName: "kube-api-access-rvcgs") pod "8347e765-6b8c-4da9-93d7-2d127a7f9389" (UID: "8347e765-6b8c-4da9-93d7-2d127a7f9389"). InnerVolumeSpecName "kube-api-access-rvcgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.962135 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8347e765-6b8c-4da9-93d7-2d127a7f9389" (UID: "8347e765-6b8c-4da9-93d7-2d127a7f9389"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.974688 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8347e765-6b8c-4da9-93d7-2d127a7f9389" (UID: "8347e765-6b8c-4da9-93d7-2d127a7f9389"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.979389 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.979450 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvcgs\" (UniqueName: \"kubernetes.io/projected/8347e765-6b8c-4da9-93d7-2d127a7f9389-kube-api-access-rvcgs\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.979468 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.984551 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8347e765-6b8c-4da9-93d7-2d127a7f9389" (UID: "8347e765-6b8c-4da9-93d7-2d127a7f9389"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:11 crc kubenswrapper[5118]: I0223 07:06:11.990433 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config" (OuterVolumeSpecName: "config") pod "8347e765-6b8c-4da9-93d7-2d127a7f9389" (UID: "8347e765-6b8c-4da9-93d7-2d127a7f9389"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.081015 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.081046 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8347e765-6b8c-4da9-93d7-2d127a7f9389-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.284724 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-95g6h" event={"ID":"3867ac0b-7e84-483e-917e-3f08a8ee2ae0","Type":"ContainerStarted","Data":"4321cb33d1b2bec2a061b811cc47a8f13fcdd296c2f386145952801ad6df548b"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.287985 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerStarted","Data":"826a07d78195a74f00e2c51d67f74e217188667a281c3a321098529f73635374"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.290856 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerStarted","Data":"5e25d102deb2a700be6004ed2ec1d482a85c613b6513c3edd5806a90dfbeef3a"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.290905 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerStarted","Data":"3668b27a2d675e42ac3e5779a046a186275f530a2ec40c3a06426c9b1aeb4291"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.307501 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dvxfg" 
event={"ID":"6d0b5007-ea48-438d-a216-e78519ab4d0d","Type":"ContainerStarted","Data":"085fb994609b4b7cd53dbff3fc6fd8454ce84798c4da501d558fe01c7bd93cad"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.311350 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-95g6h" podStartSLOduration=4.311331332 podStartE2EDuration="4.311331332s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:12.301405047 +0000 UTC m=+1235.305189610" watchObservedRunningTime="2026-02-23 07:06:12.311331332 +0000 UTC m=+1235.315115905" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.326466 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dvxfg" podStartSLOduration=5.3264472099999995 podStartE2EDuration="5.32644721s" podCreationTimestamp="2026-02-23 07:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:12.322906957 +0000 UTC m=+1235.326691530" watchObservedRunningTime="2026-02-23 07:06:12.32644721 +0000 UTC m=+1235.330231783" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.341530 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.341583 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.341596 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.343673 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerStarted","Data":"78c4c91e92c0a7b641e6107db12d1d587833c1ce2c4e4b022ec8f985be322ef4"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.347645 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" event={"ID":"8347e765-6b8c-4da9-93d7-2d127a7f9389","Type":"ContainerDied","Data":"6835e504ef1d5e5bba6edf7442f7239f37452865d8e4a99f2ca11aea3217b585"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.347684 5118 scope.go:117] "RemoveContainer" containerID="a4cbe1c5d6d2c83e36e12661477d1819dfae8c8f0c8bc202dc676babc253ff91" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.347814 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567dccfc7f-k576w" Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.363544 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv452" event={"ID":"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7","Type":"ContainerStarted","Data":"9339e69902fbcaf68e8d5a44484e67089668392aec61939eecd42466fcdec4de"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.374070 5118 generic.go:334] "Generic (PLEG): container finished" podID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerID="3eb8a68ed74f2e0eec73c120d4d54d12affb6e868fd1bc0c126db4feffda3e82" exitCode=0 Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.374127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" event={"ID":"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b","Type":"ContainerDied","Data":"3eb8a68ed74f2e0eec73c120d4d54d12affb6e868fd1bc0c126db4feffda3e82"} Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.529384 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:12 crc kubenswrapper[5118]: I0223 07:06:12.540906 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567dccfc7f-k576w"] Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.391465 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" event={"ID":"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b","Type":"ContainerStarted","Data":"df7130e45233b0a2085fcd4fa49fa016541f2ff80a77ab2ee380163038158536"} Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.393301 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.415300 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" podStartSLOduration=5.41528117 
podStartE2EDuration="5.41528117s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:13.410885226 +0000 UTC m=+1236.414669799" watchObservedRunningTime="2026-02-23 07:06:13.41528117 +0000 UTC m=+1236.419065743" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.417221 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb"} Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.417294 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerStarted","Data":"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54"} Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.425255 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerStarted","Data":"5886647b90af1a0366940b8e2157a3e6335cdd9ad47f8417b4e1b4f75a108c99"} Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.464623 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.309739345 podStartE2EDuration="45.46460398s" podCreationTimestamp="2026-02-23 07:05:28 +0000 UTC" firstStartedPulling="2026-02-23 07:06:02.291530291 +0000 UTC m=+1225.295314864" lastFinishedPulling="2026-02-23 07:06:09.446394916 +0000 UTC m=+1232.450179499" observedRunningTime="2026-02-23 07:06:13.454520651 +0000 UTC m=+1236.458305224" watchObservedRunningTime="2026-02-23 07:06:13.46460398 +0000 UTC m=+1236.468388553" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.708849 5118 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="8347e765-6b8c-4da9-93d7-2d127a7f9389" path="/var/lib/kubelet/pods/8347e765-6b8c-4da9-93d7-2d127a7f9389/volumes" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.839996 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"] Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.874558 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:13 crc kubenswrapper[5118]: E0223 07:06:13.874959 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8347e765-6b8c-4da9-93d7-2d127a7f9389" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.874978 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8347e765-6b8c-4da9-93d7-2d127a7f9389" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: E0223 07:06:13.875004 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3304d1b4-e4de-404f-ace5-2417aa3d766b" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.875011 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3304d1b4-e4de-404f-ace5-2417aa3d766b" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.875219 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3304d1b4-e4de-404f-ace5-2417aa3d766b" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.875249 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8347e765-6b8c-4da9-93d7-2d127a7f9389" containerName="init" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.876214 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.889539 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.901985 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.925779 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.925929 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.925998 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.926079 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " 
pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.926257 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6g4p\" (UniqueName: \"kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:13 crc kubenswrapper[5118]: I0223 07:06:13.926290 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.028289 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.028364 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.028420 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " 
pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.028471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6g4p\" (UniqueName: \"kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.028496 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.029351 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.029421 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.029924 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 
crc kubenswrapper[5118]: I0223 07:06:14.029973 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.030147 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.030587 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.050862 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6g4p\" (UniqueName: \"kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p\") pod \"dnsmasq-dns-68bc8f6695-qfwc4\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.200266 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.445191 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerStarted","Data":"f57c6a161d5773b5f0e68cdd32f141e669e1c38a8dd6043f1a4b7973fd5f15c6"} Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.445314 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-log" containerID="cri-o://5e25d102deb2a700be6004ed2ec1d482a85c613b6513c3edd5806a90dfbeef3a" gracePeriod=30 Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.445585 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-httpd" containerID="cri-o://f57c6a161d5773b5f0e68cdd32f141e669e1c38a8dd6043f1a4b7973fd5f15c6" gracePeriod=30 Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.450803 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerStarted","Data":"5765e6ec607d44ef01ff4b6fc9bbf475f330625eb9dabcadd9e78c99c8a59fd8"} Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.451855 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-log" containerID="cri-o://5886647b90af1a0366940b8e2157a3e6335cdd9ad47f8417b4e1b4f75a108c99" gracePeriod=30 Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.451917 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-httpd" containerID="cri-o://5765e6ec607d44ef01ff4b6fc9bbf475f330625eb9dabcadd9e78c99c8a59fd8" gracePeriod=30 Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.510063 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.5100422909999995 podStartE2EDuration="6.510042291s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:14.481816801 +0000 UTC m=+1237.485601374" watchObservedRunningTime="2026-02-23 07:06:14.510042291 +0000 UTC m=+1237.513826864" Feb 23 07:06:14 crc kubenswrapper[5118]: I0223 07:06:14.510206 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.510200724 podStartE2EDuration="6.510200724s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:14.505912423 +0000 UTC m=+1237.509696986" watchObservedRunningTime="2026-02-23 07:06:14.510200724 +0000 UTC m=+1237.513985297" Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.468889 5118 generic.go:334] "Generic (PLEG): container finished" podID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerID="5765e6ec607d44ef01ff4b6fc9bbf475f330625eb9dabcadd9e78c99c8a59fd8" exitCode=0 Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.468926 5118 generic.go:334] "Generic (PLEG): container finished" podID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerID="5886647b90af1a0366940b8e2157a3e6335cdd9ad47f8417b4e1b4f75a108c99" exitCode=143 Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.469005 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerDied","Data":"5765e6ec607d44ef01ff4b6fc9bbf475f330625eb9dabcadd9e78c99c8a59fd8"} Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.469073 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerDied","Data":"5886647b90af1a0366940b8e2157a3e6335cdd9ad47f8417b4e1b4f75a108c99"} Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.471645 5118 generic.go:334] "Generic (PLEG): container finished" podID="1051d5a3-d667-43ac-8805-587cc295fd33" containerID="f57c6a161d5773b5f0e68cdd32f141e669e1c38a8dd6043f1a4b7973fd5f15c6" exitCode=0 Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.471666 5118 generic.go:334] "Generic (PLEG): container finished" podID="1051d5a3-d667-43ac-8805-587cc295fd33" containerID="5e25d102deb2a700be6004ed2ec1d482a85c613b6513c3edd5806a90dfbeef3a" exitCode=143 Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.471709 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerDied","Data":"f57c6a161d5773b5f0e68cdd32f141e669e1c38a8dd6043f1a4b7973fd5f15c6"} Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.471741 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerDied","Data":"5e25d102deb2a700be6004ed2ec1d482a85c613b6513c3edd5806a90dfbeef3a"} Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.473841 5118 generic.go:334] "Generic (PLEG): container finished" podID="6d0b5007-ea48-438d-a216-e78519ab4d0d" containerID="085fb994609b4b7cd53dbff3fc6fd8454ce84798c4da501d558fe01c7bd93cad" exitCode=0 Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.473873 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-dvxfg" event={"ID":"6d0b5007-ea48-438d-a216-e78519ab4d0d","Type":"ContainerDied","Data":"085fb994609b4b7cd53dbff3fc6fd8454ce84798c4da501d558fe01c7bd93cad"}
Feb 23 07:06:15 crc kubenswrapper[5118]: I0223 07:06:15.474047 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="dnsmasq-dns" containerID="cri-o://df7130e45233b0a2085fcd4fa49fa016541f2ff80a77ab2ee380163038158536" gracePeriod=10
Feb 23 07:06:16 crc kubenswrapper[5118]: I0223 07:06:16.511622 5118 generic.go:334] "Generic (PLEG): container finished" podID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerID="df7130e45233b0a2085fcd4fa49fa016541f2ff80a77ab2ee380163038158536" exitCode=0
Feb 23 07:06:16 crc kubenswrapper[5118]: I0223 07:06:16.511704 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" event={"ID":"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b","Type":"ContainerDied","Data":"df7130e45233b0a2085fcd4fa49fa016541f2ff80a77ab2ee380163038158536"}
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.837314 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.848790 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl"
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900168 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900254 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcs5b\" (UniqueName: \"kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b\") pod \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900313 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config\") pod \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900418 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900524 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc\") pod \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900579 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900617 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900723 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb\") pod \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900757 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcs7t\" (UniqueName: \"kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900777 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data\") pod \"1051d5a3-d667-43ac-8805-587cc295fd33\" (UID: \"1051d5a3-d667-43ac-8805-587cc295fd33\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.900801 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb\") pod \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\" (UID: \"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b\") "
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.902857 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.903192 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs" (OuterVolumeSpecName: "logs") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.910592 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b" (OuterVolumeSpecName: "kube-api-access-mcs5b") pod "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" (UID: "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b"). InnerVolumeSpecName "kube-api-access-mcs5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.914470 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t" (OuterVolumeSpecName: "kube-api-access-zcs7t") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "kube-api-access-zcs7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.917236 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.933768 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts" (OuterVolumeSpecName: "scripts") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.967886 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.969134 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" (UID: "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.975451 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" (UID: "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.981460 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" (UID: "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.989784 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data" (OuterVolumeSpecName: "config-data") pod "1051d5a3-d667-43ac-8805-587cc295fd33" (UID: "1051d5a3-d667-43ac-8805-587cc295fd33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:20 crc kubenswrapper[5118]: I0223 07:06:20.993810 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config" (OuterVolumeSpecName: "config") pod "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" (UID: "ce8fc42c-7dfd-4242-85ca-a7a641f0f91b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003511 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003547 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003559 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003570 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003580 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003593 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcs7t\" (UniqueName: \"kubernetes.io/projected/1051d5a3-d667-43ac-8805-587cc295fd33-kube-api-access-zcs7t\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003602 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003611 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003621 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1051d5a3-d667-43ac-8805-587cc295fd33-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003629 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcs5b\" (UniqueName: \"kubernetes.io/projected/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-kube-api-access-mcs5b\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003639 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.003648 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1051d5a3-d667-43ac-8805-587cc295fd33-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.019286 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.105899 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.565508 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1051d5a3-d667-43ac-8805-587cc295fd33","Type":"ContainerDied","Data":"3668b27a2d675e42ac3e5779a046a186275f530a2ec40c3a06426c9b1aeb4291"}
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.565578 5118 scope.go:117] "RemoveContainer" containerID="f57c6a161d5773b5f0e68cdd32f141e669e1c38a8dd6043f1a4b7973fd5f15c6"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.565697 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.573735 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" event={"ID":"ce8fc42c-7dfd-4242-85ca-a7a641f0f91b","Type":"ContainerDied","Data":"8642957e6d426e68a0ae6c1c7341eef1ece2eb8d11ccdeacbf41eeabaa969403"}
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.573817 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.621859 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"]
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.673567 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-649bbcf5cc-5vgdl"]
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.766660 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" path="/var/lib/kubelet/pods/ce8fc42c-7dfd-4242-85ca-a7a641f0f91b/volumes"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.767579 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.767609 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.785484 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 07:06:21 crc kubenswrapper[5118]: E0223 07:06:21.786179 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-log"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786264 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-log"
Feb 23 07:06:21 crc kubenswrapper[5118]: E0223 07:06:21.786333 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-httpd"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786385 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-httpd"
Feb 23 07:06:21 crc kubenswrapper[5118]: E0223 07:06:21.786439 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="init"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786490 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="init"
Feb 23 07:06:21 crc kubenswrapper[5118]: E0223 07:06:21.786575 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="dnsmasq-dns"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786628 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="dnsmasq-dns"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786863 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="dnsmasq-dns"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786925 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-log"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.786989 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" containerName="glance-httpd"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.788397 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.796591 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.797333 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.798559 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.931896 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932038 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932061 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932118 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932142 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932168 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932223 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:21 crc kubenswrapper[5118]: I0223 07:06:21.932255 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n7rx\" (UniqueName: \"kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034485 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034506 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034529 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034595 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n7rx\" (UniqueName: \"kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.034694 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.035395 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.036530 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.036885 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.062394 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.062539 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.063444 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.063841 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.066515 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n7rx\" (UniqueName: \"kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.067823 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:22 crc kubenswrapper[5118]: I0223 07:06:22.117427 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:06:23 crc kubenswrapper[5118]: I0223 07:06:23.711732 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1051d5a3-d667-43ac-8805-587cc295fd33" path="/var/lib/kubelet/pods/1051d5a3-d667-43ac-8805-587cc295fd33/volumes"
Feb 23 07:06:24 crc kubenswrapper[5118]: I0223 07:06:24.038945 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-649bbcf5cc-5vgdl" podUID="ce8fc42c-7dfd-4242-85ca-a7a641f0f91b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout"
Feb 23 07:06:28 crc kubenswrapper[5118]: E0223 07:06:28.846038 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4"
Feb 23 07:06:28 crc kubenswrapper[5118]: E0223 07:06:28.846640 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8ch5b7h696h5f8h55h55ch54h5f5h5ch57dhf9h655hcdh59fh695h655h58fh58bh89hd5h94hf7h59fh7ch96hbbh648h96h9ch567h674h59bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lbvtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ff639ff8-dd33-445f-8321-6528b227179d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.019000 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dvxfg"
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.027315 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.085866 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.085977 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.086024 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.087239 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.087317 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.087372 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.088117 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089005 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs" (OuterVolumeSpecName: "logs") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089166 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089338 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089464 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mlls\" (UniqueName: \"kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089543 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089706 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089793 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc6hr\" (UniqueName: \"kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr\") pod \"d2f8b7be-c950-4cf1-82c2-4edec370501c\" (UID: \"d2f8b7be-c950-4cf1-82c2-4edec370501c\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.089889 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts\") pod \"6d0b5007-ea48-438d-a216-e78519ab4d0d\" (UID: \"6d0b5007-ea48-438d-a216-e78519ab4d0d\") "
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.093494 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.094906 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr" (OuterVolumeSpecName: "kube-api-access-pc6hr") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "kube-api-access-pc6hr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.098392 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts" (OuterVolumeSpecName: "scripts") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.098414 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc6hr\" (UniqueName: \"kubernetes.io/projected/d2f8b7be-c950-4cf1-82c2-4edec370501c-kube-api-access-pc6hr\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.098517 5118 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.098543 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.098565 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2f8b7be-c950-4cf1-82c2-4edec370501c-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23
07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.101309 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts" (OuterVolumeSpecName: "scripts") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.111516 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.117529 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls" (OuterVolumeSpecName: "kube-api-access-2mlls") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "kube-api-access-2mlls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.142451 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.143889 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.153961 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data" (OuterVolumeSpecName: "config-data") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.156381 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data" (OuterVolumeSpecName: "config-data") pod "d2f8b7be-c950-4cf1-82c2-4edec370501c" (UID: "d2f8b7be-c950-4cf1-82c2-4edec370501c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.170815 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d0b5007-ea48-438d-a216-e78519ab4d0d" (UID: "6d0b5007-ea48-438d-a216-e78519ab4d0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200872 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200913 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mlls\" (UniqueName: \"kubernetes.io/projected/6d0b5007-ea48-438d-a216-e78519ab4d0d-kube-api-access-2mlls\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200962 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200973 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200982 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.200993 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.201005 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.201020 5118 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2f8b7be-c950-4cf1-82c2-4edec370501c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.201031 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0b5007-ea48-438d-a216-e78519ab4d0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.214846 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.303835 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.513332 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec" Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.513995 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wf72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tv452_openstack(07f00e7b-49f2-4bb1-a540-c7f9a13a76b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.515271 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tv452" 
podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.655371 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dvxfg" event={"ID":"6d0b5007-ea48-438d-a216-e78519ab4d0d","Type":"ContainerDied","Data":"3d7059b37a54f5b7f21ed4b5a0af749e6c8e1ae67192e90c25af50d8fedd112a"} Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.655431 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d7059b37a54f5b7f21ed4b5a0af749e6c8e1ae67192e90c25af50d8fedd112a" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.655639 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dvxfg" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.661279 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2f8b7be-c950-4cf1-82c2-4edec370501c","Type":"ContainerDied","Data":"78c4c91e92c0a7b641e6107db12d1d587833c1ce2c4e4b022ec8f985be322ef4"} Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.661481 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.665892 5118 generic.go:334] "Generic (PLEG): container finished" podID="3867ac0b-7e84-483e-917e-3f08a8ee2ae0" containerID="4321cb33d1b2bec2a061b811cc47a8f13fcdd296c2f386145952801ad6df548b" exitCode=0 Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.666234 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-95g6h" event={"ID":"3867ac0b-7e84-483e-917e-3f08a8ee2ae0","Type":"ContainerDied","Data":"4321cb33d1b2bec2a061b811cc47a8f13fcdd296c2f386145952801ad6df548b"} Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.669389 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec\\\"\"" pod="openstack/barbican-db-sync-tv452" podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.746056 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.767170 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.776266 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.777398 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-httpd" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.777433 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-httpd" Feb 23 07:06:29 crc 
kubenswrapper[5118]: E0223 07:06:29.777452 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0b5007-ea48-438d-a216-e78519ab4d0d" containerName="keystone-bootstrap" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.777461 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0b5007-ea48-438d-a216-e78519ab4d0d" containerName="keystone-bootstrap" Feb 23 07:06:29 crc kubenswrapper[5118]: E0223 07:06:29.777493 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-log" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.777502 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-log" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.777744 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-httpd" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.781130 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" containerName="glance-log" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.781168 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0b5007-ea48-438d-a216-e78519ab4d0d" containerName="keystone-bootstrap" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.783439 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.784769 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.786895 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.787070 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.844580 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.926969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927026 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927048 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9ml\" (UniqueName: \"kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc 
kubenswrapper[5118]: I0223 07:06:29.927073 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927151 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927170 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927194 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:29 crc kubenswrapper[5118]: I0223 07:06:29.927229 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 
crc kubenswrapper[5118]: I0223 07:06:30.029110 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029172 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029204 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9ml\" (UniqueName: \"kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029238 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029284 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029309 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029345 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029720 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029785 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.029932 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.030309 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.037118 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.040046 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.040348 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.042533 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.054926 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9ml\" (UniqueName: 
\"kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.067428 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.102467 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.217153 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dvxfg"] Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.225413 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dvxfg"] Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.319179 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-45wnt"] Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.322629 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.327376 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.327533 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.328636 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.328642 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-86zb7" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.328735 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.334087 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-45wnt"] Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.439812 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.439859 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4j2t\" (UniqueName: \"kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.439900 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.439938 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.439980 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.440319 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.543901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.543956 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4j2t\" (UniqueName: 
\"kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.544004 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.544042 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.544121 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.544217 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.551014 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts\") pod \"keystone-bootstrap-45wnt\" (UID: 
\"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.551313 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.551533 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.557920 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.558692 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.568161 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4j2t\" (UniqueName: \"kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t\") pod \"keystone-bootstrap-45wnt\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") " pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 
07:06:30.648090 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-45wnt" Feb 23 07:06:30 crc kubenswrapper[5118]: I0223 07:06:30.981699 5118 scope.go:117] "RemoveContainer" containerID="5e25d102deb2a700be6004ed2ec1d482a85c613b6513c3edd5806a90dfbeef3a" Feb 23 07:06:31 crc kubenswrapper[5118]: E0223 07:06:31.026576 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 23 07:06:31 crc kubenswrapper[5118]: E0223 07:06:31.027203 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,M
ountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-28lfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fblpx_openstack(2df7ebd2-0918-4164-beec-18057338255d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 07:06:31 crc kubenswrapper[5118]: E0223 07:06:31.028694 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fblpx" podUID="2df7ebd2-0918-4164-beec-18057338255d" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.198296 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.261244 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle\") pod \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.261486 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dfr4\" (UniqueName: \"kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4\") pod \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.261547 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config\") pod \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\" (UID: \"3867ac0b-7e84-483e-917e-3f08a8ee2ae0\") " Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.267480 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4" (OuterVolumeSpecName: "kube-api-access-6dfr4") pod "3867ac0b-7e84-483e-917e-3f08a8ee2ae0" (UID: "3867ac0b-7e84-483e-917e-3f08a8ee2ae0"). InnerVolumeSpecName "kube-api-access-6dfr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.288032 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config" (OuterVolumeSpecName: "config") pod "3867ac0b-7e84-483e-917e-3f08a8ee2ae0" (UID: "3867ac0b-7e84-483e-917e-3f08a8ee2ae0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.289134 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3867ac0b-7e84-483e-917e-3f08a8ee2ae0" (UID: "3867ac0b-7e84-483e-917e-3f08a8ee2ae0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.364322 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dfr4\" (UniqueName: \"kubernetes.io/projected/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-kube-api-access-6dfr4\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.364360 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.364376 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3867ac0b-7e84-483e-917e-3f08a8ee2ae0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.506216 5118 scope.go:117] "RemoveContainer" containerID="df7130e45233b0a2085fcd4fa49fa016541f2ff80a77ab2ee380163038158536" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.556970 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.584964 5118 scope.go:117] "RemoveContainer" containerID="3eb8a68ed74f2e0eec73c120d4d54d12affb6e868fd1bc0c126db4feffda3e82" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.594000 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-45wnt"] Feb 23 
07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.655335 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.664940 5118 scope.go:117] "RemoveContainer" containerID="5765e6ec607d44ef01ff4b6fc9bbf475f330625eb9dabcadd9e78c99c8a59fd8" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.687976 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" event={"ID":"4ce713ab-ad6f-4b84-88e1-236d326a1b58","Type":"ContainerStarted","Data":"b38874902cbd2837089b3bd594a16997ae6dc281fa6a4a45455209f50844adba"} Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.708823 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-95g6h" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.720502 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0b5007-ea48-438d-a216-e78519ab4d0d" path="/var/lib/kubelet/pods/6d0b5007-ea48-438d-a216-e78519ab4d0d/volumes" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.721184 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f8b7be-c950-4cf1-82c2-4edec370501c" path="/var/lib/kubelet/pods/d2f8b7be-c950-4cf1-82c2-4edec370501c/volumes" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.722269 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerStarted","Data":"198ac77e2e045888b34196e59bffa93fd7e5ad63f3a76cdaa3617e8783ace41b"} Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.722302 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-95g6h" event={"ID":"3867ac0b-7e84-483e-917e-3f08a8ee2ae0","Type":"ContainerDied","Data":"fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5"} Feb 23 07:06:31 crc kubenswrapper[5118]: 
I0223 07:06:31.722319 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fadf0a2cee97def2333bbeeb4bb827b6e9c5c27dcc0d5c449c25ea7b72cb54a5" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.755744 5118 scope.go:117] "RemoveContainer" containerID="5886647b90af1a0366940b8e2157a3e6335cdd9ad47f8417b4e1b4f75a108c99" Feb 23 07:06:31 crc kubenswrapper[5118]: I0223 07:06:31.757980 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-45wnt" event={"ID":"d221d72b-6274-48da-900c-284185365e14","Type":"ContainerStarted","Data":"5cc2080302b9574d9e21694929ad1f0156cbe6e7a81e0df28e605163feaa0b8e"} Feb 23 07:06:31 crc kubenswrapper[5118]: E0223 07:06:31.759050 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-fblpx" podUID="2df7ebd2-0918-4164-beec-18057338255d" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.022469 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.086213 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:06:32 crc kubenswrapper[5118]: E0223 07:06:32.086688 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3867ac0b-7e84-483e-917e-3f08a8ee2ae0" containerName="neutron-db-sync" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.086708 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3867ac0b-7e84-483e-917e-3f08a8ee2ae0" containerName="neutron-db-sync" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.086929 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3867ac0b-7e84-483e-917e-3f08a8ee2ae0" containerName="neutron-db-sync" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.087985 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.095830 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.155196 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.156940 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.168348 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.169187 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.169332 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-krcbq" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.169488 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184132 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqkt\" (UniqueName: \"kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184296 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184319 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184369 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.184418 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.202884 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287498 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287576 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqkt\" (UniqueName: \"kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287618 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287652 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287711 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " 
pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287777 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287809 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287883 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.287933 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 
crc kubenswrapper[5118]: I0223 07:06:32.287979 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvt8s\" (UniqueName: \"kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.289419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.289544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.290052 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.290186 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.290629 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.313385 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqkt\" (UniqueName: \"kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt\") pod \"dnsmasq-dns-77d55b9c69-dvztp\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.390520 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.390644 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.390692 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvt8s\" (UniqueName: \"kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.390726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.390765 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.397490 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.401581 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.404917 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.404956 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config\") pod \"neutron-669cbc77fb-t2crt\" (UID: 
\"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.408373 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvt8s\" (UniqueName: \"kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s\") pod \"neutron-669cbc77fb-t2crt\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.431250 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.510456 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.786859 5118 generic.go:334] "Generic (PLEG): container finished" podID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" containerID="c6855b07d8f4422d8e94a4cb61f0646072a968d65fd63d18da86c51b79f6c67c" exitCode=0 Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.787353 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" event={"ID":"4ce713ab-ad6f-4b84-88e1-236d326a1b58","Type":"ContainerDied","Data":"c6855b07d8f4422d8e94a4cb61f0646072a968d65fd63d18da86c51b79f6c67c"} Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.796843 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerStarted","Data":"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500"} Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.823281 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lhskw" 
event={"ID":"70884006-8eb9-4cbb-ac48-a18531a8fe62","Type":"ContainerStarted","Data":"320a0c8fe4ce9d33211c93e92ea408a8c42d0c18964f7dda8ef928a9afbb7e0f"} Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.834411 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerStarted","Data":"3d29e9738ad95cfa97695152f119423f98d65f291597f6d1f5769ffb912c7feb"} Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.849663 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-45wnt" event={"ID":"d221d72b-6274-48da-900c-284185365e14","Type":"ContainerStarted","Data":"db1d242dbe5d63f480d76e72117a96ec58b79d79ceb0c11acd8fe34bb58ce66f"} Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.943238 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lhskw" podStartSLOduration=6.662800872 podStartE2EDuration="24.943217966s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="2026-02-23 07:06:10.555918226 +0000 UTC m=+1233.559702799" lastFinishedPulling="2026-02-23 07:06:28.83633532 +0000 UTC m=+1251.840119893" observedRunningTime="2026-02-23 07:06:32.852446134 +0000 UTC m=+1255.856230707" watchObservedRunningTime="2026-02-23 07:06:32.943217966 +0000 UTC m=+1255.947002539" Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.974079 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-45wnt" podStartSLOduration=2.974056258 podStartE2EDuration="2.974056258s" podCreationTimestamp="2026-02-23 07:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:32.878226295 +0000 UTC m=+1255.882010878" watchObservedRunningTime="2026-02-23 07:06:32.974056258 +0000 UTC m=+1255.977840831" Feb 23 07:06:32 crc 
kubenswrapper[5118]: I0223 07:06:32.975276 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:06:32 crc kubenswrapper[5118]: I0223 07:06:32.975369 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.012481 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.053736 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:06:33 crc kubenswrapper[5118]: E0223 07:06:33.350295 5118 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 23 07:06:33 crc kubenswrapper[5118]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4ce713ab-ad6f-4b84-88e1-236d326a1b58/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:06:33 crc kubenswrapper[5118]: > podSandboxID="b38874902cbd2837089b3bd594a16997ae6dc281fa6a4a45455209f50844adba" Feb 23 07:06:33 crc kubenswrapper[5118]: E0223 07:06:33.351207 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:06:33 crc kubenswrapper[5118]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dbh55fhfh596h68dh54dh55ch68ch5c7h5f8h68ch58fh644hb8h65bhc5h5d9h5bdhdh577h5dbh64dh694h687h667hcdh664h695h6fh579h95h565q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r6g4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-68bc8f6695-qfwc4_openstack(4ce713ab-ad6f-4b84-88e1-236d326a1b58): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4ce713ab-ad6f-4b84-88e1-236d326a1b58/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 23 07:06:33 crc kubenswrapper[5118]: > logger="UnhandledError" Feb 23 07:06:33 crc kubenswrapper[5118]: E0223 07:06:33.352317 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4ce713ab-ad6f-4b84-88e1-236d326a1b58/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" podUID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.866731 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerStarted","Data":"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.867197 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerStarted","Data":"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.869361 5118 generic.go:334] "Generic (PLEG): container finished" podID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerID="ab973d15165d83835af258696bd69fae7cee6956504739dbf641f604a05c4f3b" exitCode=0 Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.869428 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" event={"ID":"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea","Type":"ContainerDied","Data":"ab973d15165d83835af258696bd69fae7cee6956504739dbf641f604a05c4f3b"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.869463 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" event={"ID":"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea","Type":"ContainerStarted","Data":"6f7295c341d553d38257baec77da0c8c36a3c6d74cf484d2a659f796ae940dd4"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.874240 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerStarted","Data":"2bd607225ed6de7ecdf4edab5d293168921b5d05789c1817b483b260e5a22348"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.874305 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" 
event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerStarted","Data":"8f35eaca65d796ab21f4ca0bcf43c8c10780748560c0fee8cea0a49d6e4d4212"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.874322 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerStarted","Data":"53cb9a6f5b6607145d0065f1061b097e7990aff809ac6e35ec98e20aed2dae2d"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.875516 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.884724 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerStarted","Data":"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.884772 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerStarted","Data":"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0"} Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.948354 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.94832097 podStartE2EDuration="12.94832097s" podCreationTimestamp="2026-02-23 07:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:33.9284778 +0000 UTC m=+1256.932262393" watchObservedRunningTime="2026-02-23 07:06:33.94832097 +0000 UTC m=+1256.952105543" Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.948524 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=4.948519175 podStartE2EDuration="4.948519175s" podCreationTimestamp="2026-02-23 07:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:33.898467178 +0000 UTC m=+1256.902251751" watchObservedRunningTime="2026-02-23 07:06:33.948519175 +0000 UTC m=+1256.952303748" Feb 23 07:06:33 crc kubenswrapper[5118]: I0223 07:06:33.994537 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-669cbc77fb-t2crt" podStartSLOduration=1.994517846 podStartE2EDuration="1.994517846s" podCreationTimestamp="2026-02-23 07:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:33.989744202 +0000 UTC m=+1256.993528775" watchObservedRunningTime="2026-02-23 07:06:33.994517846 +0000 UTC m=+1256.998302419" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.434988 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580353 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580466 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580624 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580740 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6g4p\" (UniqueName: \"kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580846 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.580882 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config\") pod \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\" (UID: \"4ce713ab-ad6f-4b84-88e1-236d326a1b58\") " Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.606171 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p" (OuterVolumeSpecName: "kube-api-access-r6g4p") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "kube-api-access-r6g4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.637185 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.642136 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.653505 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config" (OuterVolumeSpecName: "config") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.655023 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.671567 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ce713ab-ad6f-4b84-88e1-236d326a1b58" (UID: "4ce713ab-ad6f-4b84-88e1-236d326a1b58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683761 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683788 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683801 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6g4p\" (UniqueName: \"kubernetes.io/projected/4ce713ab-ad6f-4b84-88e1-236d326a1b58-kube-api-access-r6g4p\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683816 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683830 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.683842 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ce713ab-ad6f-4b84-88e1-236d326a1b58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.905092 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" event={"ID":"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea","Type":"ContainerStarted","Data":"d4e13a58c524fa157e11bb80e1c91fd643b862183d0f6d64de6798e3f21b49a4"} Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.905614 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.911656 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" event={"ID":"4ce713ab-ad6f-4b84-88e1-236d326a1b58","Type":"ContainerDied","Data":"b38874902cbd2837089b3bd594a16997ae6dc281fa6a4a45455209f50844adba"} Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.911724 5118 scope.go:117] "RemoveContainer" containerID="c6855b07d8f4422d8e94a4cb61f0646072a968d65fd63d18da86c51b79f6c67c" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.911868 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68bc8f6695-qfwc4" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.918902 5118 generic.go:334] "Generic (PLEG): container finished" podID="70884006-8eb9-4cbb-ac48-a18531a8fe62" containerID="320a0c8fe4ce9d33211c93e92ea408a8c42d0c18964f7dda8ef928a9afbb7e0f" exitCode=0 Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.919276 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lhskw" event={"ID":"70884006-8eb9-4cbb-ac48-a18531a8fe62","Type":"ContainerDied","Data":"320a0c8fe4ce9d33211c93e92ea408a8c42d0c18964f7dda8ef928a9afbb7e0f"} Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.952631 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" podStartSLOduration=2.952606685 podStartE2EDuration="2.952606685s" podCreationTimestamp="2026-02-23 07:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:34.9308861 +0000 UTC m=+1257.934670673" watchObservedRunningTime="2026-02-23 07:06:34.952606685 +0000 UTC m=+1257.956391268" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.985698 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"] Feb 23 07:06:34 crc kubenswrapper[5118]: E0223 07:06:34.986251 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" containerName="init" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.986265 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" containerName="init" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.986463 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" containerName="init" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.988063 5118 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.992445 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 23 07:06:34 crc kubenswrapper[5118]: I0223 07:06:34.992640 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.080935 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"] Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099270 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099302 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099351 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099393 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7bd\" (UniqueName: \"kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.099462 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.116710 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.124019 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68bc8f6695-qfwc4"] Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201708 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7bd\" (UniqueName: 
\"kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201813 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201872 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201941 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201967 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.201987 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle\") pod 
\"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.202029 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.208139 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.211669 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.211798 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.214535 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " 
pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.218646 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7bd\" (UniqueName: \"kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.219141 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.233047 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config\") pod \"neutron-6f964cfbb7-9np4m\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.356366 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f964cfbb7-9np4m"
Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.742464 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce713ab-ad6f-4b84-88e1-236d326a1b58" path="/var/lib/kubelet/pods/4ce713ab-ad6f-4b84-88e1-236d326a1b58/volumes"
Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.938798 5118 generic.go:334] "Generic (PLEG): container finished" podID="d221d72b-6274-48da-900c-284185365e14" containerID="db1d242dbe5d63f480d76e72117a96ec58b79d79ceb0c11acd8fe34bb58ce66f" exitCode=0
Feb 23 07:06:35 crc kubenswrapper[5118]: I0223 07:06:35.938911 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-45wnt" event={"ID":"d221d72b-6274-48da-900c-284185365e14","Type":"ContainerDied","Data":"db1d242dbe5d63f480d76e72117a96ec58b79d79ceb0c11acd8fe34bb58ce66f"}
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.087535 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"]
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.388713 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lhskw"
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.438179 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs\") pod \"70884006-8eb9-4cbb-ac48-a18531a8fe62\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") "
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.438606 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data\") pod \"70884006-8eb9-4cbb-ac48-a18531a8fe62\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") "
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.438743 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle\") pod \"70884006-8eb9-4cbb-ac48-a18531a8fe62\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") "
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.438776 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts\") pod \"70884006-8eb9-4cbb-ac48-a18531a8fe62\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") "
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.438925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzj8c\" (UniqueName: \"kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c\") pod \"70884006-8eb9-4cbb-ac48-a18531a8fe62\" (UID: \"70884006-8eb9-4cbb-ac48-a18531a8fe62\") "
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.440066 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs" (OuterVolumeSpecName: "logs") pod "70884006-8eb9-4cbb-ac48-a18531a8fe62" (UID: "70884006-8eb9-4cbb-ac48-a18531a8fe62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.443989 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c" (OuterVolumeSpecName: "kube-api-access-tzj8c") pod "70884006-8eb9-4cbb-ac48-a18531a8fe62" (UID: "70884006-8eb9-4cbb-ac48-a18531a8fe62"). InnerVolumeSpecName "kube-api-access-tzj8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.447575 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts" (OuterVolumeSpecName: "scripts") pod "70884006-8eb9-4cbb-ac48-a18531a8fe62" (UID: "70884006-8eb9-4cbb-ac48-a18531a8fe62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.464921 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data" (OuterVolumeSpecName: "config-data") pod "70884006-8eb9-4cbb-ac48-a18531a8fe62" (UID: "70884006-8eb9-4cbb-ac48-a18531a8fe62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.469825 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70884006-8eb9-4cbb-ac48-a18531a8fe62" (UID: "70884006-8eb9-4cbb-ac48-a18531a8fe62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.541317 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.541347 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.541357 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzj8c\" (UniqueName: \"kubernetes.io/projected/70884006-8eb9-4cbb-ac48-a18531a8fe62-kube-api-access-tzj8c\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.541367 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70884006-8eb9-4cbb-ac48-a18531a8fe62-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.541376 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70884006-8eb9-4cbb-ac48-a18531a8fe62-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.956366 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lhskw"
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.956431 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lhskw" event={"ID":"70884006-8eb9-4cbb-ac48-a18531a8fe62","Type":"ContainerDied","Data":"fcb06f1d231ae10929513a90b7cf4d242fb1cde0d9a7e1426fbe77ff163c2b93"}
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.956485 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb06f1d231ae10929513a90b7cf4d242fb1cde0d9a7e1426fbe77ff163c2b93"
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.976787 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerStarted","Data":"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e"}
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.976852 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerStarted","Data":"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8"}
Feb 23 07:06:36 crc kubenswrapper[5118]: I0223 07:06:36.976864 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerStarted","Data":"20adc449979bbfb184e306c25c6f13fefb5bed7347151e335b26ef7ca7869140"}
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.005782 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f964cfbb7-9np4m" podStartSLOduration=3.005751431 podStartE2EDuration="3.005751431s" podCreationTimestamp="2026-02-23 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:36.994127965 +0000 UTC m=+1259.997912548" watchObservedRunningTime="2026-02-23 07:06:37.005751431 +0000 UTC m=+1260.009536004"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.167043 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b976557fd-6lkcx"]
Feb 23 07:06:37 crc kubenswrapper[5118]: E0223 07:06:37.167808 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70884006-8eb9-4cbb-ac48-a18531a8fe62" containerName="placement-db-sync"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.167822 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="70884006-8eb9-4cbb-ac48-a18531a8fe62" containerName="placement-db-sync"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.168011 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="70884006-8eb9-4cbb-ac48-a18531a8fe62" containerName="placement-db-sync"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.168985 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.176042 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.177735 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.177941 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.178054 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gg6n8"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.178376 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.205444 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b976557fd-6lkcx"]
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.255360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.255427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.255707 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.256013 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.256074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.256100 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.256184 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfht9\" (UniqueName: \"kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358078 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358290 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358318 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358348 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfht9\" (UniqueName: \"kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358400 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.358439 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.359113 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.359144 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.362368 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.363972 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.364708 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.366570 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.375680 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.378593 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfht9\" (UniqueName: \"kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9\") pod \"placement-b976557fd-6lkcx\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.496528 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:37 crc kubenswrapper[5118]: I0223 07:06:37.984719 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f964cfbb7-9np4m"
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.723824 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-45wnt"
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.795889 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.795940 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.795988 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.796020 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4j2t\" (UniqueName: \"kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.796091 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.796297 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle\") pod \"d221d72b-6274-48da-900c-284185365e14\" (UID: \"d221d72b-6274-48da-900c-284185365e14\") "
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.803781 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.804078 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.806153 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t" (OuterVolumeSpecName: "kube-api-access-p4j2t") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "kube-api-access-p4j2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.806801 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts" (OuterVolumeSpecName: "scripts") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.835185 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data" (OuterVolumeSpecName: "config-data") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.851055 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d221d72b-6274-48da-900c-284185365e14" (UID: "d221d72b-6274-48da-900c-284185365e14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898591 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898620 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4j2t\" (UniqueName: \"kubernetes.io/projected/d221d72b-6274-48da-900c-284185365e14-kube-api-access-p4j2t\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898633 5118 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898642 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898650 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.898658 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d221d72b-6274-48da-900c-284185365e14-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.995041 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerStarted","Data":"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8"}
Feb 23 07:06:38 crc kubenswrapper[5118]: I0223 07:06:38.998481 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-45wnt"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.007296 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-45wnt" event={"ID":"d221d72b-6274-48da-900c-284185365e14","Type":"ContainerDied","Data":"5cc2080302b9574d9e21694929ad1f0156cbe6e7a81e0df28e605163feaa0b8e"}
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.007325 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cc2080302b9574d9e21694929ad1f0156cbe6e7a81e0df28e605163feaa0b8e"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.120561 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b976557fd-6lkcx"]
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.911550 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-777dc4b79-zkfms"]
Feb 23 07:06:39 crc kubenswrapper[5118]: E0223 07:06:39.913061 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d221d72b-6274-48da-900c-284185365e14" containerName="keystone-bootstrap"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.913087 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d221d72b-6274-48da-900c-284185365e14" containerName="keystone-bootstrap"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.913509 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d221d72b-6274-48da-900c-284185365e14" containerName="keystone-bootstrap"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.914716 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.919245 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.920157 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.920956 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-86zb7"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.921335 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.922351 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.925004 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 07:06:39 crc kubenswrapper[5118]: I0223 07:06:39.947466 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-777dc4b79-zkfms"]
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.006033 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerStarted","Data":"9c2df29044a4a6505db392529c5d66e0c129921b569f448d9246b1dfaaeab1f6"}
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.006185 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerStarted","Data":"45ffa8bb3a7b2fda9250327348a8dbae14fb92a0d0dba32bea29be4860f5a723"}
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.006199 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerStarted","Data":"dab074f4c5a6cda8167c62ebbfc6403ad510cb26329d6e63a6effd40549b2e61"}
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.006332 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.027747 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-b976557fd-6lkcx" podStartSLOduration=3.027726502 podStartE2EDuration="3.027726502s" podCreationTimestamp="2026-02-23 07:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:40.024361261 +0000 UTC m=+1263.028145834" watchObservedRunningTime="2026-02-23 07:06:40.027726502 +0000 UTC m=+1263.031511085"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.041427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.041483 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.041509 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.041896 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlhr\" (UniqueName: \"kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.042053 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.042262 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.042398 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.042589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.103182 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.103281 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.143730 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.144707 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.144823 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.144895 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.144988 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.145030 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.145069 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.145119 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.145224 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlhr\" (UniqueName: \"kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.149638 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.152023 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.153307 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.155543 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.156455 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.161953 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlhr\" (UniqueName: \"kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.170986 5118 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.171649 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.174419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs\") pod \"keystone-777dc4b79-zkfms\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") " pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.245599 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:06:40 crc kubenswrapper[5118]: W0223 07:06:40.734834 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92a80c23_cba1_417f_bbd5_5c5138c3664a.slice/crio-591ea039ba39ce6f2d7336c52c3117c715770dcae9d1daf7a2d08d88be55b1a2 WatchSource:0}: Error finding container 591ea039ba39ce6f2d7336c52c3117c715770dcae9d1daf7a2d08d88be55b1a2: Status 404 returned error can't find the container with id 591ea039ba39ce6f2d7336c52c3117c715770dcae9d1daf7a2d08d88be55b1a2 Feb 23 07:06:40 crc kubenswrapper[5118]: I0223 07:06:40.735076 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-777dc4b79-zkfms"] Feb 23 07:06:41 crc kubenswrapper[5118]: I0223 07:06:41.017354 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777dc4b79-zkfms" event={"ID":"92a80c23-cba1-417f-bbd5-5c5138c3664a","Type":"ContainerStarted","Data":"591ea039ba39ce6f2d7336c52c3117c715770dcae9d1daf7a2d08d88be55b1a2"} Feb 23 07:06:41 crc kubenswrapper[5118]: I0223 07:06:41.018543 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:06:41 crc kubenswrapper[5118]: I0223 07:06:41.018601 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-b976557fd-6lkcx" Feb 23 07:06:41 crc kubenswrapper[5118]: I0223 07:06:41.018623 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.030413 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777dc4b79-zkfms" event={"ID":"92a80c23-cba1-417f-bbd5-5c5138c3664a","Type":"ContainerStarted","Data":"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"} Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.031124 5118 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.060135 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-777dc4b79-zkfms" podStartSLOduration=3.060089384 podStartE2EDuration="3.060089384s" podCreationTimestamp="2026-02-23 07:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:42.054753529 +0000 UTC m=+1265.058538102" watchObservedRunningTime="2026-02-23 07:06:42.060089384 +0000 UTC m=+1265.063873957" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.117699 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.117795 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.170477 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.170861 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.443086 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.514698 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"] Feb 23 07:06:42 crc kubenswrapper[5118]: I0223 07:06:42.514970 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" 
containerName="dnsmasq-dns" containerID="cri-o://396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1" gracePeriod=10 Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.027438 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.059583 5118 generic.go:334] "Generic (PLEG): container finished" podID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerID="396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1" exitCode=0 Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.059742 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.059752 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.060717 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.061253 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" event={"ID":"d4b341f1-f2c2-4114-b200-1d1c351e84ac","Type":"ContainerDied","Data":"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1"} Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.061284 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-vhntj" event={"ID":"d4b341f1-f2c2-4114-b200-1d1c351e84ac","Type":"ContainerDied","Data":"b5424f608bab6d406d9e103c42ccd47452c79d433df11ccf0e604ab2feaa728e"} Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.061305 5118 scope.go:117] "RemoveContainer" containerID="396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.062491 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.062569 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.094516 5118 scope.go:117] "RemoveContainer" containerID="aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.116857 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb\") pod \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.116946 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx9bv\" (UniqueName: \"kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv\") pod \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.116974 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb\") pod \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.117050 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config\") pod \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.117123 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc\") pod \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\" (UID: \"d4b341f1-f2c2-4114-b200-1d1c351e84ac\") " Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.124616 5118 scope.go:117] "RemoveContainer" containerID="396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1" Feb 23 07:06:43 crc kubenswrapper[5118]: E0223 07:06:43.125709 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1\": container with ID starting with 396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1 not found: ID does not exist" containerID="396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.125740 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1"} err="failed to get container status \"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1\": rpc error: code = NotFound desc = could not find container \"396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1\": container with ID starting with 396196a9a8b3ea34651973e5bea2333ef8caaa7e9b06c6c050c1e8f06318e6c1 not found: ID does not exist" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.125764 5118 scope.go:117] "RemoveContainer" containerID="aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.126613 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv" (OuterVolumeSpecName: "kube-api-access-sx9bv") pod "d4b341f1-f2c2-4114-b200-1d1c351e84ac" (UID: "d4b341f1-f2c2-4114-b200-1d1c351e84ac"). InnerVolumeSpecName "kube-api-access-sx9bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:43 crc kubenswrapper[5118]: E0223 07:06:43.129827 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b\": container with ID starting with aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b not found: ID does not exist" containerID="aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.129857 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b"} err="failed to get container status \"aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b\": rpc error: code = NotFound desc = could not find container \"aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b\": container with ID starting with aed2b7e06e5114781de232bbf35986f8c21c5b0b6f271eeed9b699e5ad7ad12b not found: ID does not exist" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.179007 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4b341f1-f2c2-4114-b200-1d1c351e84ac" (UID: "d4b341f1-f2c2-4114-b200-1d1c351e84ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.180828 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4b341f1-f2c2-4114-b200-1d1c351e84ac" (UID: "d4b341f1-f2c2-4114-b200-1d1c351e84ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.219828 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.219878 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx9bv\" (UniqueName: \"kubernetes.io/projected/d4b341f1-f2c2-4114-b200-1d1c351e84ac-kube-api-access-sx9bv\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.219855 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config" (OuterVolumeSpecName: "config") pod "d4b341f1-f2c2-4114-b200-1d1c351e84ac" (UID: "d4b341f1-f2c2-4114-b200-1d1c351e84ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.219891 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.228718 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4b341f1-f2c2-4114-b200-1d1c351e84ac" (UID: "d4b341f1-f2c2-4114-b200-1d1c351e84ac"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.322579 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.322624 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4b341f1-f2c2-4114-b200-1d1c351e84ac-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.398667 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"] Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.407151 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-vhntj"] Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.446190 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.454001 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:06:43 crc kubenswrapper[5118]: I0223 07:06:43.708903 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" path="/var/lib/kubelet/pods/d4b341f1-f2c2-4114-b200-1d1c351e84ac/volumes" Feb 23 07:06:44 crc kubenswrapper[5118]: I0223 07:06:44.069742 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv452" event={"ID":"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7","Type":"ContainerStarted","Data":"21901553cbeacd59d5ec7cc9bf72db01ee50046e05f98b576ecedf03ec161b8b"} Feb 23 07:06:44 crc kubenswrapper[5118]: I0223 07:06:44.090710 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tv452" 
podStartSLOduration=4.081509632 podStartE2EDuration="36.090621605s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="2026-02-23 07:06:11.164744353 +0000 UTC m=+1234.168528926" lastFinishedPulling="2026-02-23 07:06:43.173856326 +0000 UTC m=+1266.177640899" observedRunningTime="2026-02-23 07:06:44.087935502 +0000 UTC m=+1267.091720095" watchObservedRunningTime="2026-02-23 07:06:44.090621605 +0000 UTC m=+1267.094406178" Feb 23 07:06:45 crc kubenswrapper[5118]: I0223 07:06:45.270824 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:45 crc kubenswrapper[5118]: I0223 07:06:45.271338 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:06:45 crc kubenswrapper[5118]: I0223 07:06:45.343047 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:06:46 crc kubenswrapper[5118]: I0223 07:06:46.093296 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fblpx" event={"ID":"2df7ebd2-0918-4164-beec-18057338255d","Type":"ContainerStarted","Data":"e3d135e626ac942021e458bc4a7ab385023f6d9287f81b2ebc06d10db2903a65"} Feb 23 07:06:46 crc kubenswrapper[5118]: I0223 07:06:46.096145 5118 generic.go:334] "Generic (PLEG): container finished" podID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" containerID="21901553cbeacd59d5ec7cc9bf72db01ee50046e05f98b576ecedf03ec161b8b" exitCode=0 Feb 23 07:06:46 crc kubenswrapper[5118]: I0223 07:06:46.096215 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv452" event={"ID":"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7","Type":"ContainerDied","Data":"21901553cbeacd59d5ec7cc9bf72db01ee50046e05f98b576ecedf03ec161b8b"} Feb 23 07:06:46 crc kubenswrapper[5118]: I0223 07:06:46.122514 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fblpx" 
podStartSLOduration=3.018807154 podStartE2EDuration="38.122491367s" podCreationTimestamp="2026-02-23 07:06:08 +0000 UTC" firstStartedPulling="2026-02-23 07:06:09.994287419 +0000 UTC m=+1232.998071992" lastFinishedPulling="2026-02-23 07:06:45.097971632 +0000 UTC m=+1268.101756205" observedRunningTime="2026-02-23 07:06:46.113886443 +0000 UTC m=+1269.117671016" watchObservedRunningTime="2026-02-23 07:06:46.122491367 +0000 UTC m=+1269.126275940" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.611729 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.734136 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data\") pod \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.734212 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wf72\" (UniqueName: \"kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72\") pod \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.734289 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle\") pod \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\" (UID: \"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7\") " Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.742460 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72" (OuterVolumeSpecName: "kube-api-access-8wf72") pod 
"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" (UID: "07f00e7b-49f2-4bb1-a540-c7f9a13a76b7"). InnerVolumeSpecName "kube-api-access-8wf72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.743414 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" (UID: "07f00e7b-49f2-4bb1-a540-c7f9a13a76b7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.766463 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" (UID: "07f00e7b-49f2-4bb1-a540-c7f9a13a76b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.838359 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.838408 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:47 crc kubenswrapper[5118]: I0223 07:06:47.838426 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wf72\" (UniqueName: \"kubernetes.io/projected/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7-kube-api-access-8wf72\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.127036 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tv452" event={"ID":"07f00e7b-49f2-4bb1-a540-c7f9a13a76b7","Type":"ContainerDied","Data":"9339e69902fbcaf68e8d5a44484e67089668392aec61939eecd42466fcdec4de"} Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.127501 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9339e69902fbcaf68e8d5a44484e67089668392aec61939eecd42466fcdec4de" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.127601 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tv452" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.411640 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"] Feb 23 07:06:48 crc kubenswrapper[5118]: E0223 07:06:48.412362 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerName="dnsmasq-dns" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.412387 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerName="dnsmasq-dns" Feb 23 07:06:48 crc kubenswrapper[5118]: E0223 07:06:48.412415 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerName="init" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.412424 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerName="init" Feb 23 07:06:48 crc kubenswrapper[5118]: E0223 07:06:48.412447 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" containerName="barbican-db-sync" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.412457 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" containerName="barbican-db-sync" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.412619 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b341f1-f2c2-4114-b200-1d1c351e84ac" containerName="dnsmasq-dns" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.412646 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" containerName="barbican-db-sync" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.413701 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.423579 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nn6zr" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.423813 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.423925 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.441162 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.519998 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.521771 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.526702 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.547976 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.550372 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.555854 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.556182 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.556297 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.556340 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsk5g\" (UniqueName: \"kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.556402 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs\") 
pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.572496 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.585669 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.657885 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.657955 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658041 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b27p\" (UniqueName: \"kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658067 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658120 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658145 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658179 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658199 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658221 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658239 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9wp\" (UniqueName: \"kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658265 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658306 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc 
kubenswrapper[5118]: I0223 07:06:48.658367 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsk5g\" (UniqueName: \"kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658424 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.658451 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.659757 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.664816 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.665236 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.665304 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.700275 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.703766 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.707278 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsk5g\" (UniqueName: \"kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g\") pod \"barbican-worker-69db8f76f-xbrx7\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.712369 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.722286 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"] Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.733042 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.760325 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.760809 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.760951 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.761038 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9wp\" (UniqueName: \"kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.761316 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc\") pod 
\"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.761403 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.761730 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.761818 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762302 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762566 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" 
(UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762686 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b27p\" (UniqueName: \"kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762793 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762833 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762689 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.762891 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " 
pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.763153 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.763652 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.764751 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.766341 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.768142 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " 
pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.784079 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9wp\" (UniqueName: \"kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp\") pod \"barbican-keystone-listener-868db49cd-qncjp\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.784154 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b27p\" (UniqueName: \"kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p\") pod \"dnsmasq-dns-7489f6876c-jrdzk\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.862220 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.864608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.864763 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.864888 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.864997 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.865074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8jw\" (UniqueName: \"kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.892476 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.966955 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.967044 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.967123 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.967146 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8jw\" (UniqueName: \"kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.967232 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " 
pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.968451 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.974848 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.975733 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.976971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:48 crc kubenswrapper[5118]: I0223 07:06:48.985878 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8jw\" (UniqueName: \"kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw\") pod \"barbican-api-85db9c6f6b-8bxj7\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") " pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:49 crc kubenswrapper[5118]: 
I0223 07:06:49.186469 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:49 crc kubenswrapper[5118]: I0223 07:06:49.234198 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"] Feb 23 07:06:49 crc kubenswrapper[5118]: I0223 07:06:49.344953 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"] Feb 23 07:06:49 crc kubenswrapper[5118]: I0223 07:06:49.431927 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:49 crc kubenswrapper[5118]: W0223 07:06:49.440847 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf266dcd3_521e_4624_81e5_0eabd56cac4d.slice/crio-a7aeb4ce120722b01099c984ae9f354f9112d66b7ea85367e3d06b86589b3f80 WatchSource:0}: Error finding container a7aeb4ce120722b01099c984ae9f354f9112d66b7ea85367e3d06b86589b3f80: Status 404 returned error can't find the container with id a7aeb4ce120722b01099c984ae9f354f9112d66b7ea85367e3d06b86589b3f80 Feb 23 07:06:49 crc kubenswrapper[5118]: I0223 07:06:49.676285 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"] Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.148698 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerStarted","Data":"c1a7cde756082566a8a408e773200dce857577a40647d552b637bbb7c31095d8"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.150758 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" 
event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerStarted","Data":"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.150788 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerStarted","Data":"ec9425b1c66e54214e7ae5f43b94a33209df9058840ba3447f4b614f2c85b353"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.154334 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerStarted","Data":"ff230597bdf1525b2a5fda543a890742ba45b434150c6322561fdf78b0de4c0e"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.158255 5118 generic.go:334] "Generic (PLEG): container finished" podID="2df7ebd2-0918-4164-beec-18057338255d" containerID="e3d135e626ac942021e458bc4a7ab385023f6d9287f81b2ebc06d10db2903a65" exitCode=0 Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.158319 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fblpx" event={"ID":"2df7ebd2-0918-4164-beec-18057338255d","Type":"ContainerDied","Data":"e3d135e626ac942021e458bc4a7ab385023f6d9287f81b2ebc06d10db2903a65"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.160646 5118 generic.go:334] "Generic (PLEG): container finished" podID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerID="396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe" exitCode=0 Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.160706 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" event={"ID":"f266dcd3-521e-4624-81e5-0eabd56cac4d","Type":"ContainerDied","Data":"396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe"} Feb 23 07:06:50 crc kubenswrapper[5118]: I0223 07:06:50.160738 5118 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" event={"ID":"f266dcd3-521e-4624-81e5-0eabd56cac4d","Type":"ContainerStarted","Data":"a7aeb4ce120722b01099c984ae9f354f9112d66b7ea85367e3d06b86589b3f80"} Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.173329 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerStarted","Data":"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"} Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.194683 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-85db9c6f6b-8bxj7" podStartSLOduration=3.194658433 podStartE2EDuration="3.194658433s" podCreationTimestamp="2026-02-23 07:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:51.193205688 +0000 UTC m=+1274.196990301" watchObservedRunningTime="2026-02-23 07:06:51.194658433 +0000 UTC m=+1274.198443006" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.578112 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"] Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.580246 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.615046 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.615277 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.627665 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"] Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.657831 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.657905 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp268\" (UniqueName: \"kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.657951 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.657980 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.658014 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.658033 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.679786 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788191 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788263 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp268\" (UniqueName: \"kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788369 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788441 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.788468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.794038 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.843708 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.844207 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.844284 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.844421 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs\") pod 
\"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.845239 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.859771 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp268\" (UniqueName: \"kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268\") pod \"barbican-api-5df75dfc9b-mpgf2\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.967291 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:51 crc kubenswrapper[5118]: I0223 07:06:51.971853 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.093755 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.093813 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.093907 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.094157 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28lfg\" (UniqueName: \"kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.094226 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.094261 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts\") pod \"2df7ebd2-0918-4164-beec-18057338255d\" (UID: \"2df7ebd2-0918-4164-beec-18057338255d\") " Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.100423 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.106606 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts" (OuterVolumeSpecName: "scripts") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.106688 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.107074 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg" (OuterVolumeSpecName: "kube-api-access-28lfg") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "kube-api-access-28lfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.151795 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.187463 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" event={"ID":"f266dcd3-521e-4624-81e5-0eabd56cac4d","Type":"ContainerStarted","Data":"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.189320 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.198842 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28lfg\" (UniqueName: \"kubernetes.io/projected/2df7ebd2-0918-4164-beec-18057338255d-kube-api-access-28lfg\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.198869 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.198882 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.198894 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.198906 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2df7ebd2-0918-4164-beec-18057338255d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.205392 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerStarted","Data":"4478c2eb2702eff226081dd62c590b24a317a608e7697bf1cd6f86ae6c3d71e1"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.206463 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerStarted","Data":"838eea4ca37b326a64d5d678a35327acd726a105a4ac2e5f2979892d448df136"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.212764 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data" (OuterVolumeSpecName: "config-data") pod "2df7ebd2-0918-4164-beec-18057338255d" (UID: "2df7ebd2-0918-4164-beec-18057338255d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.217747 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" podStartSLOduration=4.217723183 podStartE2EDuration="4.217723183s" podCreationTimestamp="2026-02-23 07:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:52.214124978 +0000 UTC m=+1275.217909551" watchObservedRunningTime="2026-02-23 07:06:52.217723183 +0000 UTC m=+1275.221507756" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.221474 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerStarted","Data":"6562661aead1d6afb939f1cc5250e5487e9146e38f4c8f43650a040bdd480d69"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.221520 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerStarted","Data":"c1cb45006da98332eb6cd8b2fbe6eb461c5eecf7073435dc50db5dc59fb5544b"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.234109 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fblpx" event={"ID":"2df7ebd2-0918-4164-beec-18057338255d","Type":"ContainerDied","Data":"f17b3306d490c28d3c5c4745327401652eb09de2794354c7c7d46f11d0bc1c37"} Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.234173 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f17b3306d490c28d3c5c4745327401652eb09de2794354c7c7d46f11d0bc1c37" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.234171 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fblpx" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.237147 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.237177 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.257980 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" podStartSLOduration=2.687210941 podStartE2EDuration="4.257929107s" podCreationTimestamp="2026-02-23 07:06:48 +0000 UTC" firstStartedPulling="2026-02-23 07:06:49.353123935 +0000 UTC m=+1272.356908498" lastFinishedPulling="2026-02-23 07:06:50.923842091 +0000 UTC m=+1273.927626664" observedRunningTime="2026-02-23 07:06:52.254733311 +0000 UTC m=+1275.258517884" watchObservedRunningTime="2026-02-23 07:06:52.257929107 +0000 UTC m=+1275.261713700" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.285409 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69db8f76f-xbrx7" podStartSLOduration=2.628932178 podStartE2EDuration="4.285385067s" podCreationTimestamp="2026-02-23 07:06:48 +0000 UTC" firstStartedPulling="2026-02-23 07:06:49.253546764 +0000 UTC m=+1272.257331357" lastFinishedPulling="2026-02-23 07:06:50.909999683 +0000 UTC m=+1273.913784246" observedRunningTime="2026-02-23 07:06:52.275454742 +0000 UTC m=+1275.279239315" watchObservedRunningTime="2026-02-23 07:06:52.285385067 +0000 UTC m=+1275.289169640" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.301711 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2df7ebd2-0918-4164-beec-18057338255d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:52 crc kubenswrapper[5118]: W0223 
07:06:52.541771 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c7c403_ece4_4778_9a1c_25dbc355a0bf.slice/crio-7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3 WatchSource:0}: Error finding container 7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3: Status 404 returned error can't find the container with id 7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3 Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.545663 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.563686 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:06:52 crc kubenswrapper[5118]: E0223 07:06:52.564317 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2df7ebd2-0918-4164-beec-18057338255d" containerName="cinder-db-sync" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.564341 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df7ebd2-0918-4164-beec-18057338255d" containerName="cinder-db-sync" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.564609 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2df7ebd2-0918-4164-beec-18057338255d" containerName="cinder-db-sync" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.566303 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.570751 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.571311 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.571466 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.571591 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s4k5s" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.605352 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.670084 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711454 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711509 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711529 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblhp\" (UniqueName: \"kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711583 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.711611 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.733239 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.742043 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.776871 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.778753 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.782464 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.788587 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.798967 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.813785 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.813830 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblhp\" (UniqueName: \"kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.813872 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.813963 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: 
\"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.813986 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814021 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kx5l\" (UniqueName: \"kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814045 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814079 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814130 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: 
\"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814159 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814210 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.814245 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.816025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.822923 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 
07:06:52.823020 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.828745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.837703 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.851153 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblhp\" (UniqueName: \"kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp\") pod \"cinder-scheduler-0\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") " pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.893162 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.916683 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.916770 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.916903 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.916938 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.916985 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917014 
5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917054 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917086 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917191 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kx5l\" (UniqueName: \"kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917220 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntk6\" (UniqueName: \"kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917254 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917292 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.917319 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.918360 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.918407 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.918802 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.919056 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.919397 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:52 crc kubenswrapper[5118]: I0223 07:06:52.933871 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kx5l\" (UniqueName: \"kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l\") pod \"dnsmasq-dns-6c69c79c7f-vdh92\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.019548 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.019897 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020064 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020137 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020174 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020198 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020321 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.020968 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.021063 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hntk6\" (UniqueName: \"kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.028582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.029409 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.032306 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.033651 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.038159 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntk6\" (UniqueName: \"kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6\") pod \"cinder-api-0\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.077406 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.106044 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.257473 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerStarted","Data":"b78a6f1ffe539d4eb30bf74ad5a43f3802cf0350d1aba256152e434c6b28d94e"} Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.257859 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerStarted","Data":"27b321bfffbb4bd657b82546013c7861bf1c2f5af05ee3956e5a515a75954c59"} Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.257870 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerStarted","Data":"7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3"} Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.258717 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.258734 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 
07:06:53.308422 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podStartSLOduration=2.308400526 podStartE2EDuration="2.308400526s" podCreationTimestamp="2026-02-23 07:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:53.294026186 +0000 UTC m=+1276.297810759" watchObservedRunningTime="2026-02-23 07:06:53.308400526 +0000 UTC m=+1276.312185099" Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.417931 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:06:53 crc kubenswrapper[5118]: W0223 07:06:53.432481 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90694ec2_b87b_476a_9601_4dcb9eb0c2a1.slice/crio-bb499fc0731ab058598678190007644245e2829d92880aa76b4cc588a49f0401 WatchSource:0}: Error finding container bb499fc0731ab058598678190007644245e2829d92880aa76b4cc588a49f0401: Status 404 returned error can't find the container with id bb499fc0731ab058598678190007644245e2829d92880aa76b4cc588a49f0401 Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.668884 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:06:53 crc kubenswrapper[5118]: W0223 07:06:53.696178 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4609473_1aaa_4fb5_9971_6978cbe4ae63.slice/crio-28f9356e0a8318fed6f4ef5d1c89f8ef39c38b72cc27ee5509f8aeb3d2eeaa65 WatchSource:0}: Error finding container 28f9356e0a8318fed6f4ef5d1c89f8ef39c38b72cc27ee5509f8aeb3d2eeaa65: Status 404 returned error can't find the container with id 28f9356e0a8318fed6f4ef5d1c89f8ef39c38b72cc27ee5509f8aeb3d2eeaa65 Feb 23 07:06:53 crc kubenswrapper[5118]: W0223 07:06:53.771544 5118 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ce3d279_e276_4f0c_a7c5_d5ecbbc4546b.slice/crio-ffae23b2d1693109cb29cccc175e916c895d297c5f2f6f931d700b8a0ca6ccf7 WatchSource:0}: Error finding container ffae23b2d1693109cb29cccc175e916c895d297c5f2f6f931d700b8a0ca6ccf7: Status 404 returned error can't find the container with id ffae23b2d1693109cb29cccc175e916c895d297c5f2f6f931d700b8a0ca6ccf7 Feb 23 07:06:53 crc kubenswrapper[5118]: I0223 07:06:53.772225 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.275984 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerStarted","Data":"ffae23b2d1693109cb29cccc175e916c895d297c5f2f6f931d700b8a0ca6ccf7"} Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.280249 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerStarted","Data":"bb499fc0731ab058598678190007644245e2829d92880aa76b4cc588a49f0401"} Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.282177 5118 generic.go:334] "Generic (PLEG): container finished" podID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerID="d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f" exitCode=0 Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.282413 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="dnsmasq-dns" containerID="cri-o://70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9" gracePeriod=10 Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.283709 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" event={"ID":"d4609473-1aaa-4fb5-9971-6978cbe4ae63","Type":"ContainerDied","Data":"d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f"} Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.283735 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" event={"ID":"d4609473-1aaa-4fb5-9971-6978cbe4ae63","Type":"ContainerStarted","Data":"28f9356e0a8318fed6f4ef5d1c89f8ef39c38b72cc27ee5509f8aeb3d2eeaa65"} Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.846911 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.972319 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b27p\" (UniqueName: \"kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.972988 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.975163 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.975219 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.975281 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.975316 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc\") pod \"f266dcd3-521e-4624-81e5-0eabd56cac4d\" (UID: \"f266dcd3-521e-4624-81e5-0eabd56cac4d\") " Feb 23 07:06:54 crc kubenswrapper[5118]: I0223 07:06:54.994525 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p" (OuterVolumeSpecName: "kube-api-access-4b27p") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "kube-api-access-4b27p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.080652 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.081896 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b27p\" (UniqueName: \"kubernetes.io/projected/f266dcd3-521e-4624-81e5-0eabd56cac4d-kube-api-access-4b27p\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.081923 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.092657 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.095213 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.105839 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config" (OuterVolumeSpecName: "config") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.164895 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f266dcd3-521e-4624-81e5-0eabd56cac4d" (UID: "f266dcd3-521e-4624-81e5-0eabd56cac4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.183757 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.183798 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.183808 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.183817 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f266dcd3-521e-4624-81e5-0eabd56cac4d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.190510 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.323581 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerStarted","Data":"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138"} Feb 23 
07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.327404 5118 generic.go:334] "Generic (PLEG): container finished" podID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerID="70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9" exitCode=0 Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.327485 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.327494 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" event={"ID":"f266dcd3-521e-4624-81e5-0eabd56cac4d","Type":"ContainerDied","Data":"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9"} Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.327555 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7489f6876c-jrdzk" event={"ID":"f266dcd3-521e-4624-81e5-0eabd56cac4d","Type":"ContainerDied","Data":"a7aeb4ce120722b01099c984ae9f354f9112d66b7ea85367e3d06b86589b3f80"} Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.327579 5118 scope.go:117] "RemoveContainer" containerID="70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.331061 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerStarted","Data":"9476527837f0c6d64af1fe1bb75147baaee9ff0d903acf86c7c799d9ec0a9635"} Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.336214 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" event={"ID":"d4609473-1aaa-4fb5-9971-6978cbe4ae63","Type":"ContainerStarted","Data":"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f"} Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.372173 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" podStartSLOduration=3.372152804 podStartE2EDuration="3.372152804s" podCreationTimestamp="2026-02-23 07:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:55.368142889 +0000 UTC m=+1278.371927462" watchObservedRunningTime="2026-02-23 07:06:55.372152804 +0000 UTC m=+1278.375937377" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.409484 5118 scope.go:117] "RemoveContainer" containerID="396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.425673 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.456044 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7489f6876c-jrdzk"] Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.457104 5118 scope.go:117] "RemoveContainer" containerID="70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9" Feb 23 07:06:55 crc kubenswrapper[5118]: E0223 07:06:55.462250 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9\": container with ID starting with 70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9 not found: ID does not exist" containerID="70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.462384 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9"} err="failed to get container status \"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9\": rpc error: code = NotFound desc = could not find container 
\"70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9\": container with ID starting with 70637d09bc73acb9eeed8226b83bccf334e62ab227e123204880e3dca15c16e9 not found: ID does not exist" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.462479 5118 scope.go:117] "RemoveContainer" containerID="396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe" Feb 23 07:06:55 crc kubenswrapper[5118]: E0223 07:06:55.465580 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe\": container with ID starting with 396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe not found: ID does not exist" containerID="396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.466017 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe"} err="failed to get container status \"396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe\": rpc error: code = NotFound desc = could not find container \"396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe\": container with ID starting with 396acec47f52e706ff7cb8d562b4c5821c9ad643487ec957c408b194d978d4fe not found: ID does not exist" Feb 23 07:06:55 crc kubenswrapper[5118]: I0223 07:06:55.717534 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" path="/var/lib/kubelet/pods/f266dcd3-521e-4624-81e5-0eabd56cac4d/volumes" Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.146619 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.360895 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerStarted","Data":"9bca37bb9bf6a40a1f9ef890c9265c70efd4c4ad88e76eaff67886212a4d137b"} Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.368534 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api-log" containerID="cri-o://dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" gracePeriod=30 Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.368803 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerStarted","Data":"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880"} Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.368903 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api" containerID="cri-o://d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" gracePeriod=30 Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.368937 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.368993 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.393649 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.545735211 podStartE2EDuration="4.393621497s" podCreationTimestamp="2026-02-23 07:06:52 +0000 UTC" firstStartedPulling="2026-02-23 07:06:53.45236925 +0000 UTC m=+1276.456153823" lastFinishedPulling="2026-02-23 07:06:54.300255536 +0000 UTC m=+1277.304040109" observedRunningTime="2026-02-23 
07:06:56.37857027 +0000 UTC m=+1279.382354843" watchObservedRunningTime="2026-02-23 07:06:56.393621497 +0000 UTC m=+1279.397406070" Feb 23 07:06:56 crc kubenswrapper[5118]: I0223 07:06:56.411263 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.411241394 podStartE2EDuration="4.411241394s" podCreationTimestamp="2026-02-23 07:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:06:56.399549537 +0000 UTC m=+1279.403334100" watchObservedRunningTime="2026-02-23 07:06:56.411241394 +0000 UTC m=+1279.415025967" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.204867 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234000 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234129 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234270 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234293 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hntk6\" (UniqueName: \"kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234345 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234449 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.234506 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom\") pod \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\" (UID: \"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b\") " Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.237654 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.238048 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs" (OuterVolumeSpecName: "logs") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.249321 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts" (OuterVolumeSpecName: "scripts") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.275374 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.285340 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6" (OuterVolumeSpecName: "kube-api-access-hntk6") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "kube-api-access-hntk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.339825 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.339858 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.339868 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.339879 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.339888 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hntk6\" (UniqueName: \"kubernetes.io/projected/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-kube-api-access-hntk6\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.403355 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.403513 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data" (OuterVolumeSpecName: "config-data") pod "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" (UID: "0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.431232 5118 generic.go:334] "Generic (PLEG): container finished" podID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerID="d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" exitCode=0 Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.431469 5118 generic.go:334] "Generic (PLEG): container finished" podID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerID="dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" exitCode=143 Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.432492 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerDied","Data":"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880"} Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.432527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerDied","Data":"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138"} Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.432539 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b","Type":"ContainerDied","Data":"ffae23b2d1693109cb29cccc175e916c895d297c5f2f6f931d700b8a0ca6ccf7"} Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.432554 5118 scope.go:117] "RemoveContainer" 
containerID="d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.432582 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.443688 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.443897 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.532283 5118 scope.go:117] "RemoveContainer" containerID="dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.533277 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.566524 5118 scope.go:117] "RemoveContainer" containerID="d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.571248 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880\": container with ID starting with d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880 not found: ID does not exist" containerID="d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.571299 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880"} err="failed 
to get container status \"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880\": rpc error: code = NotFound desc = could not find container \"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880\": container with ID starting with d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880 not found: ID does not exist" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.571339 5118 scope.go:117] "RemoveContainer" containerID="dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.573706 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138\": container with ID starting with dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138 not found: ID does not exist" containerID="dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.573728 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138"} err="failed to get container status \"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138\": rpc error: code = NotFound desc = could not find container \"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138\": container with ID starting with dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138 not found: ID does not exist" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.573742 5118 scope.go:117] "RemoveContainer" containerID="d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.574227 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880"} 
err="failed to get container status \"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880\": rpc error: code = NotFound desc = could not find container \"d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880\": container with ID starting with d646d6cbe13e7be9b0b59b872cdc9c46b903c9ca8cea4dacc1a08f9679c9b880 not found: ID does not exist" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.574246 5118 scope.go:117] "RemoveContainer" containerID="dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.574450 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138"} err="failed to get container status \"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138\": rpc error: code = NotFound desc = could not find container \"dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138\": container with ID starting with dc2ba93713df881d427a2d2ac9d1f09e92bdc33ef9c92c410bdc193efe4bf138 not found: ID does not exist" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.577857 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.585235 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.585768 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="init" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.585787 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="init" Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.585804 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" 
containerName="cinder-api" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.585812 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api" Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.585836 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api-log" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.585843 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api-log" Feb 23 07:06:57 crc kubenswrapper[5118]: E0223 07:06:57.585856 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="dnsmasq-dns" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.585862 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="dnsmasq-dns" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.586070 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api-log" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.586089 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" containerName="cinder-api" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.586117 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f266dcd3-521e-4624-81e5-0eabd56cac4d" containerName="dnsmasq-dns" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.587258 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.591629 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.595322 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.595382 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.595716 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.649836 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.649908 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.649926 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650168 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5j6zj\" (UniqueName: \"kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650361 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650507 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650674 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.650709 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts\") pod \"cinder-api-0\" 
(UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.713765 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b" path="/var/lib/kubelet/pods/0ce3d279-e276-4f0c-a7c5-d5ecbbc4546b/volumes" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.751385 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752140 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j6zj\" (UniqueName: \"kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752192 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752231 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752277 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752307 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752384 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.752433 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.753317 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.753700 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.756389 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.756525 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.757034 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.757721 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.759062 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " 
pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.761587 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.768955 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j6zj\" (UniqueName: \"kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj\") pod \"cinder-api-0\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") " pod="openstack/cinder-api-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.893616 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 07:06:57 crc kubenswrapper[5118]: I0223 07:06:57.911063 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 07:06:58 crc kubenswrapper[5118]: I0223 07:06:58.048627 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-85db9c6f6b-8bxj7" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.521495 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.756970 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"] Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.757504 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f964cfbb7-9np4m" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-api" containerID="cri-o://2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8" gracePeriod=30 Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.757711 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f964cfbb7-9np4m" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-httpd" containerID="cri-o://1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e" gracePeriod=30 Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.771919 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f964cfbb7-9np4m" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": read tcp 10.217.0.2:58374->10.217.0.152:9696: read: connection reset by peer" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.814189 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.815952 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.827868 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.973763 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.973815 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974116 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974253 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974408 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2wx\" (UniqueName: \"kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974538 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974896 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:07:02 crc kubenswrapper[5118]: I0223 07:07:02.974944 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.078215 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.078848 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.078918 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2wx\" (UniqueName: \"kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.078969 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.079021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.079042 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config\") pod 
\"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.079116 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.082580 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.089031 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.089127 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.089918 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.093971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.099343 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.106679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.124025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2wx\" (UniqueName: \"kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx\") pod \"neutron-6ff569978f-gwmwn\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.158379 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.158659 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="dnsmasq-dns" containerID="cri-o://d4e13a58c524fa157e11bb80e1c91fd643b862183d0f6d64de6798e3f21b49a4" gracePeriod=10 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.188793 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.244823 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.364287 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:07:03 crc kubenswrapper[5118]: E0223 07:07:03.367783 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ff639ff8-dd33-445f-8321-6528b227179d" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.441236 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 07:07:03 crc kubenswrapper[5118]: W0223 07:07:03.449038 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf424d603_7efb_4075_9a9b_5117dec09a6a.slice/crio-838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f WatchSource:0}: Error finding container 838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f: Status 404 returned error can't find the container with id 838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.521412 5118 generic.go:334] "Generic (PLEG): container finished" podID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerID="1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e" exitCode=0 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.521509 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerDied","Data":"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e"} 
Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.525955 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerStarted","Data":"838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f"} Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.532698 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerStarted","Data":"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1"} Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.532957 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="ceilometer-notification-agent" containerID="cri-o://93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500" gracePeriod=30 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.533343 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.533669 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="proxy-httpd" containerID="cri-o://26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1" gracePeriod=30 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.533781 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="sg-core" containerID="cri-o://5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8" gracePeriod=30 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.542882 5118 generic.go:334] "Generic (PLEG): container finished" podID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" 
containerID="d4e13a58c524fa157e11bb80e1c91fd643b862183d0f6d64de6798e3f21b49a4" exitCode=0 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.543145 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" event={"ID":"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea","Type":"ContainerDied","Data":"d4e13a58c524fa157e11bb80e1c91fd643b862183d0f6d64de6798e3f21b49a4"} Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.544032 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="cinder-scheduler" containerID="cri-o://9476527837f0c6d64af1fe1bb75147baaee9ff0d903acf86c7c799d9ec0a9635" gracePeriod=30 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.544335 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="probe" containerID="cri-o://9bca37bb9bf6a40a1f9ef890c9265c70efd4c4ad88e76eaff67886212a4d137b" gracePeriod=30 Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.737166 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908286 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908449 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908590 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908725 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htqkt\" (UniqueName: \"kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908759 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.908789 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb\") pod \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\" (UID: \"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea\") " Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.921873 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt" (OuterVolumeSpecName: "kube-api-access-htqkt") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "kube-api-access-htqkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:03 crc kubenswrapper[5118]: I0223 07:07:03.983062 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config" (OuterVolumeSpecName: "config") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.010974 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.011278 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqkt\" (UniqueName: \"kubernetes.io/projected/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-kube-api-access-htqkt\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.011312 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.011324 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.015601 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.016231 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.017967 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.032708 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" (UID: "b0b45588-ab90-4a85-b1ce-2f5f8226c9ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.116340 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.116415 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.116425 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.161592 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.431013 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.453688 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.491739 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"] Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.492021 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85db9c6f6b-8bxj7" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api-log" containerID="cri-o://7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7" gracePeriod=30 Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.492248 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-85db9c6f6b-8bxj7" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api" containerID="cri-o://f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19" gracePeriod=30 Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.548646 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.548723 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.548887 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: 
\"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.548993 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l7bd\" (UniqueName: \"kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.549016 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.549067 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.552949 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config\") pod \"cfc07232-4f7f-4922-996a-2f95a26bc25d\" (UID: \"cfc07232-4f7f-4922-996a-2f95a26bc25d\") " Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.575693 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd" (OuterVolumeSpecName: "kube-api-access-2l7bd") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "kube-api-access-2l7bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.595166 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.621321 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" event={"ID":"b0b45588-ab90-4a85-b1ce-2f5f8226c9ea","Type":"ContainerDied","Data":"6f7295c341d553d38257baec77da0c8c36a3c6d74cf484d2a659f796ae940dd4"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.621395 5118 scope.go:117] "RemoveContainer" containerID="d4e13a58c524fa157e11bb80e1c91fd643b862183d0f6d64de6798e3f21b49a4" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.621568 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77d55b9c69-dvztp" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.629974 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerStarted","Data":"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.630019 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerStarted","Data":"5e2ad5be86c31b1d2c83c5e76adb5efa3697dd889de903b885eeee7094932341"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.641332 5118 generic.go:334] "Generic (PLEG): container finished" podID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerID="2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8" exitCode=0 Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.641429 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerDied","Data":"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.641475 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f964cfbb7-9np4m" event={"ID":"cfc07232-4f7f-4922-996a-2f95a26bc25d","Type":"ContainerDied","Data":"20adc449979bbfb184e306c25c6f13fefb5bed7347151e335b26ef7ca7869140"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.641569 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f964cfbb7-9np4m" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.647335 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerStarted","Data":"744308f3e6aa45dc8d65925950643c4deaca79d38d7fbf553daec9ce3a79d864"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.651328 5118 generic.go:334] "Generic (PLEG): container finished" podID="ff639ff8-dd33-445f-8321-6528b227179d" containerID="5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8" exitCode=2 Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.651389 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerDied","Data":"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.652521 5118 scope.go:117] "RemoveContainer" containerID="ab973d15165d83835af258696bd69fae7cee6956504739dbf641f604a05c4f3b" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.656215 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l7bd\" (UniqueName: \"kubernetes.io/projected/cfc07232-4f7f-4922-996a-2f95a26bc25d-kube-api-access-2l7bd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.656234 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.695204 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.695291 5118 generic.go:334] "Generic (PLEG): container finished" podID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" 
containerID="9bca37bb9bf6a40a1f9ef890c9265c70efd4c4ad88e76eaff67886212a4d137b" exitCode=0 Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.696287 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerDied","Data":"9bca37bb9bf6a40a1f9ef890c9265c70efd4c4ad88e76eaff67886212a4d137b"} Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.706819 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77d55b9c69-dvztp"] Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.735278 5118 scope.go:117] "RemoveContainer" containerID="1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.799441 5118 scope.go:117] "RemoveContainer" containerID="2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.803374 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.806455 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.807185 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.808271 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.825907 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config" (OuterVolumeSpecName: "config") pod "cfc07232-4f7f-4922-996a-2f95a26bc25d" (UID: "cfc07232-4f7f-4922-996a-2f95a26bc25d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.849562 5118 scope.go:117] "RemoveContainer" containerID="1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e" Feb 23 07:07:04 crc kubenswrapper[5118]: E0223 07:07:04.850333 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e\": container with ID starting with 1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e not found: ID does not exist" containerID="1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.850373 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e"} err="failed to get container status \"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e\": rpc error: code = NotFound desc = could not find container \"1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e\": container with ID starting with 1acb02d453ca5fdf7d79e45cb48ab75f968e99811eb6012cf5c8370378d0be1e not found: ID does not exist" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.850400 5118 scope.go:117] "RemoveContainer" containerID="2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8" Feb 23 07:07:04 crc kubenswrapper[5118]: E0223 07:07:04.850992 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8\": container with ID starting with 2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8 not found: ID does not exist" containerID="2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8" Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.851082 
5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8"} err="failed to get container status \"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8\": rpc error: code = NotFound desc = could not find container \"2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8\": container with ID starting with 2e8647c802550d381ac8b243f535b3993225a36a6b4d40c71aa3de3265a870a8 not found: ID does not exist"
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.860718 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.860756 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.860770 5118 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.860778 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.860788 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfc07232-4f7f-4922-996a-2f95a26bc25d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:04 crc kubenswrapper[5118]: I0223 07:07:04.991008 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"]
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.005997 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f964cfbb7-9np4m"]
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.710560 5118 generic.go:334] "Generic (PLEG): container finished" podID="f62662e6-3502-42da-a237-b3d2a4d97153" containerID="7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7" exitCode=143
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.711082 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" path="/var/lib/kubelet/pods/b0b45588-ab90-4a85-b1ce-2f5f8226c9ea/volumes"
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.711952 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" path="/var/lib/kubelet/pods/cfc07232-4f7f-4922-996a-2f95a26bc25d/volumes"
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.712740 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.712779 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerStarted","Data":"50ca6bcdb97f2c6020994172d7c32e3ae68a54330a49a74eecef1ee656c8e4db"}
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.712799 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerDied","Data":"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"}
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.714765 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerStarted","Data":"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0"}
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.715006 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6ff569978f-gwmwn"
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.741413 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.74139096 podStartE2EDuration="8.74139096s" podCreationTimestamp="2026-02-23 07:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:05.729707733 +0000 UTC m=+1288.733492326" watchObservedRunningTime="2026-02-23 07:07:05.74139096 +0000 UTC m=+1288.745175543"
Feb 23 07:07:05 crc kubenswrapper[5118]: I0223 07:07:05.756573 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6ff569978f-gwmwn" podStartSLOduration=3.756550689 podStartE2EDuration="3.756550689s" podCreationTimestamp="2026-02-23 07:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:05.754023349 +0000 UTC m=+1288.757807952" watchObservedRunningTime="2026-02-23 07:07:05.756550689 +0000 UTC m=+1288.760335272"
Feb 23 07:07:06 crc kubenswrapper[5118]: I0223 07:07:06.740249 5118 generic.go:334] "Generic (PLEG): container finished" podID="ff639ff8-dd33-445f-8321-6528b227179d" containerID="93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500" exitCode=0
Feb 23 07:07:06 crc kubenswrapper[5118]: I0223 07:07:06.740332 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerDied","Data":"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500"}
Feb 23 07:07:06 crc kubenswrapper[5118]: I0223 07:07:06.743758 5118 generic.go:334] "Generic (PLEG): container finished" podID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerID="9476527837f0c6d64af1fe1bb75147baaee9ff0d903acf86c7c799d9ec0a9635" exitCode=0
Feb 23 07:07:06 crc kubenswrapper[5118]: I0223 07:07:06.743812 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerDied","Data":"9476527837f0c6d64af1fe1bb75147baaee9ff0d903acf86c7c799d9ec0a9635"}
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.015738 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111209 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zblhp\" (UniqueName: \"kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111402 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111533 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111548 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111700 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111733 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.111791 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts\") pod \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\" (UID: \"90694ec2-b87b-476a-9601-4dcb9eb0c2a1\") "
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.112319 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.119790 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts" (OuterVolumeSpecName: "scripts") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.127421 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.136291 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp" (OuterVolumeSpecName: "kube-api-access-zblhp") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "kube-api-access-zblhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.183449 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.214280 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.214334 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.214351 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.214366 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zblhp\" (UniqueName: \"kubernetes.io/projected/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-kube-api-access-zblhp\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.226291 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data" (OuterVolumeSpecName: "config-data") pod "90694ec2-b87b-476a-9601-4dcb9eb0c2a1" (UID: "90694ec2-b87b-476a-9601-4dcb9eb0c2a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.316988 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90694ec2-b87b-476a-9601-4dcb9eb0c2a1-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.756532 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90694ec2-b87b-476a-9601-4dcb9eb0c2a1","Type":"ContainerDied","Data":"bb499fc0731ab058598678190007644245e2829d92880aa76b4cc588a49f0401"}
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.756626 5118 scope.go:117] "RemoveContainer" containerID="9bca37bb9bf6a40a1f9ef890c9265c70efd4c4ad88e76eaff67886212a4d137b"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.756687 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.869199 5118 scope.go:117] "RemoveContainer" containerID="9476527837f0c6d64af1fe1bb75147baaee9ff0d903acf86c7c799d9ec0a9635"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.874481 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.887138 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.904640 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905246 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="init"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905260 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="init"
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905290 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="dnsmasq-dns"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905296 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="dnsmasq-dns"
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905306 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-httpd"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905312 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-httpd"
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905325 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="cinder-scheduler"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905331 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="cinder-scheduler"
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905342 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="probe"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905355 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="probe"
Feb 23 07:07:07 crc kubenswrapper[5118]: E0223 07:07:07.905369 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-api"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905376 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-api"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905659 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="probe"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905676 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-api"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905687 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc07232-4f7f-4922-996a-2f95a26bc25d" containerName="neutron-httpd"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905698 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" containerName="cinder-scheduler"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.905707 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b45588-ab90-4a85-b1ce-2f5f8226c9ea" containerName="dnsmasq-dns"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.906734 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.911596 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 23 07:07:07 crc kubenswrapper[5118]: I0223 07:07:07.938268 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040445 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7gnk\" (UniqueName: \"kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040578 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040606 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040650 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040695 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.040718 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.142947 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.142990 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.143051 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.143294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.143320 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.143383 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7gnk\" (UniqueName: \"kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.145516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.154696 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.155543 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.156569 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.156583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.164586 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7gnk\" (UniqueName: \"kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk\") pod \"cinder-scheduler-0\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.239456 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85db9c6f6b-8bxj7"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.285352 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.347538 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj8jw\" (UniqueName: \"kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw\") pod \"f62662e6-3502-42da-a237-b3d2a4d97153\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") "
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.347602 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle\") pod \"f62662e6-3502-42da-a237-b3d2a4d97153\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") "
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.347660 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom\") pod \"f62662e6-3502-42da-a237-b3d2a4d97153\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") "
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.347765 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data\") pod \"f62662e6-3502-42da-a237-b3d2a4d97153\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") "
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.348063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs\") pod \"f62662e6-3502-42da-a237-b3d2a4d97153\" (UID: \"f62662e6-3502-42da-a237-b3d2a4d97153\") "
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.348812 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs" (OuterVolumeSpecName: "logs") pod "f62662e6-3502-42da-a237-b3d2a4d97153" (UID: "f62662e6-3502-42da-a237-b3d2a4d97153"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.353799 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw" (OuterVolumeSpecName: "kube-api-access-gj8jw") pod "f62662e6-3502-42da-a237-b3d2a4d97153" (UID: "f62662e6-3502-42da-a237-b3d2a4d97153"). InnerVolumeSpecName "kube-api-access-gj8jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.355207 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f62662e6-3502-42da-a237-b3d2a4d97153" (UID: "f62662e6-3502-42da-a237-b3d2a4d97153"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.404465 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f62662e6-3502-42da-a237-b3d2a4d97153" (UID: "f62662e6-3502-42da-a237-b3d2a4d97153"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.451533 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62662e6-3502-42da-a237-b3d2a4d97153-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.451781 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj8jw\" (UniqueName: \"kubernetes.io/projected/f62662e6-3502-42da-a237-b3d2a4d97153-kube-api-access-gj8jw\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.451793 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.451802 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.488260 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data" (OuterVolumeSpecName: "config-data") pod "f62662e6-3502-42da-a237-b3d2a4d97153" (UID: "f62662e6-3502-42da-a237-b3d2a4d97153"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.559635 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62662e6-3502-42da-a237-b3d2a4d97153-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.771342 5118 generic.go:334] "Generic (PLEG): container finished" podID="f62662e6-3502-42da-a237-b3d2a4d97153" containerID="f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19" exitCode=0
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.771402 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerDied","Data":"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"}
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.771423 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-85db9c6f6b-8bxj7"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.771457 5118 scope.go:117] "RemoveContainer" containerID="f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.771443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-85db9c6f6b-8bxj7" event={"ID":"f62662e6-3502-42da-a237-b3d2a4d97153","Type":"ContainerDied","Data":"ec9425b1c66e54214e7ae5f43b94a33209df9058840ba3447f4b614f2c85b353"}
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.806647 5118 scope.go:117] "RemoveContainer" containerID="7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.808926 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"]
Feb 23 07:07:08 crc kubenswrapper[5118]: W0223 07:07:08.816231 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff79df5_722f_4ea5_91a2_8368d8eeee99.slice/crio-e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81 WatchSource:0}: Error finding container e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81: Status 404 returned error can't find the container with id e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.820293 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.824752 5118 scope.go:117] "RemoveContainer" containerID="f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"
Feb 23 07:07:08 crc kubenswrapper[5118]: E0223 07:07:08.825509 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19\": container with ID starting with f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19 not found: ID does not exist" containerID="f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.825543 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19"} err="failed to get container status \"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19\": rpc error: code = NotFound desc = could not find container \"f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19\": container with ID starting with f4b2318f7fa6cdd61ae5088d575454451c21108cd8eec3ab7bfd04f33f244d19 not found: ID does not exist"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.825569 5118 scope.go:117] "RemoveContainer" containerID="7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"
Feb 23 07:07:08 crc kubenswrapper[5118]: E0223 07:07:08.825845 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7\": container with ID starting with 7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7 not found: ID does not exist" containerID="7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.825891 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7"} err="failed to get container status \"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7\": rpc error: code = NotFound desc = could not find container \"7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7\": container with ID starting with 7345e1858b986a6f5a155835cee28c97fb49833abe54c3df3749728ef29efbd7 not found: ID does not exist"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.832042 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-85db9c6f6b-8bxj7"]
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.882758 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:07:08 crc kubenswrapper[5118]: I0223 07:07:08.884421 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-b976557fd-6lkcx"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.147524 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"]
Feb 23 07:07:09 crc kubenswrapper[5118]: E0223 07:07:09.148026 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.148042 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api"
Feb 23 07:07:09 crc kubenswrapper[5118]: E0223 07:07:09.148069 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api-log"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.148077 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api-log"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.148299 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api-log"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.148319 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" containerName="barbican-api"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.150739 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.160663 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"]
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.281611 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.281660 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.281691 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.281711 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.281771 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.282032 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xcp\" (UniqueName: \"kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.282171 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.384828 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.384915 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xcp\" (UniqueName: \"kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.384951 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.385008 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.385026 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.385049 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.385071 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.385345 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs\") pod 
\"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.396522 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.396817 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.396861 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.396864 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.400056 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 
23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.405337 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xcp\" (UniqueName: \"kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp\") pod \"placement-6b5b85dd46-5prjs\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.469287 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.716005 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90694ec2-b87b-476a-9601-4dcb9eb0c2a1" path="/var/lib/kubelet/pods/90694ec2-b87b-476a-9601-4dcb9eb0c2a1/volumes" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.717917 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62662e6-3502-42da-a237-b3d2a4d97153" path="/var/lib/kubelet/pods/f62662e6-3502-42da-a237-b3d2a4d97153/volumes" Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.785830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerStarted","Data":"56b4d180783d05f72c701c589b061dafc5eb3886c60c04d8b8ec016724107c8f"} Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.785910 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerStarted","Data":"e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81"} Feb 23 07:07:09 crc kubenswrapper[5118]: I0223 07:07:09.981551 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"] Feb 23 07:07:09 crc kubenswrapper[5118]: W0223 07:07:09.984051 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0e825c7_deb0_41b5_b358_f23dcc0f1082.slice/crio-13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135 WatchSource:0}: Error finding container 13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135: Status 404 returned error can't find the container with id 13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135 Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.797301 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerStarted","Data":"bf1c589b730495897b68ee4004af7ede183db68decb4860331926ec63e038b03"} Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.798374 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerStarted","Data":"d9b130e05974e910c11275a4779f706f2b8c61a00e6da82692831f10958cd486"} Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.798396 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerStarted","Data":"13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135"} Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.798421 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.798443 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.799126 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerStarted","Data":"befb16d8a3f73c9a8d367e972c7e2cebda984c914cdc82b0856f084d6aa276f7"} Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.820224 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b5b85dd46-5prjs" podStartSLOduration=1.820201334 podStartE2EDuration="1.820201334s" podCreationTimestamp="2026-02-23 07:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:10.814271223 +0000 UTC m=+1293.818055796" watchObservedRunningTime="2026-02-23 07:07:10.820201334 +0000 UTC m=+1293.823985907" Feb 23 07:07:10 crc kubenswrapper[5118]: I0223 07:07:10.848074 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.848054754 podStartE2EDuration="3.848054754s" podCreationTimestamp="2026-02-23 07:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:10.839573353 +0000 UTC m=+1293.843357926" watchObservedRunningTime="2026-02-23 07:07:10.848054754 +0000 UTC m=+1293.851839327" Feb 23 07:07:11 crc kubenswrapper[5118]: I0223 07:07:11.943159 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-777dc4b79-zkfms" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.284034 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.285844 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.289495 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.290194 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h2t7j" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.291456 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.303898 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.374754 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.374824 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67t5b\" (UniqueName: \"kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.374848 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.375020 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.477414 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.477517 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.477571 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67t5b\" (UniqueName: \"kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.477596 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.478512 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.488942 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.489995 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.494561 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67t5b\" (UniqueName: \"kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b\") pod \"openstackclient\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.605826 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 07:07:12 crc kubenswrapper[5118]: I0223 07:07:12.943029 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 07:07:13 crc kubenswrapper[5118]: I0223 07:07:13.285794 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 07:07:13 crc kubenswrapper[5118]: I0223 07:07:13.861917 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b3633e5-65f3-41c8-be57-5c4e28227ec9","Type":"ContainerStarted","Data":"d251465d6930dc16da92acb2f16b30087e733c1d3bd169553bd1c45a6e13b740"} Feb 23 07:07:14 crc kubenswrapper[5118]: I0223 07:07:14.861089 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.245876 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-v6s5m"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.248212 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.253830 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v6s5m"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.361945 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ks9t7"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.367900 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.368399 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ks9t7"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.390387 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.390448 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgvh\" (UniqueName: \"kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.457853 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-n6s7x"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.459640 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.488533 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2de9-account-create-update-8hbsf"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.490775 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.493934 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.494562 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.494619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgvh\" (UniqueName: \"kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.494695 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.495756 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2l9\" (UniqueName: \"kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.496503 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.508408 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2de9-account-create-update-8hbsf"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.521668 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n6s7x"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.536260 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgvh\" (UniqueName: \"kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh\") pod \"nova-api-db-create-v6s5m\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.578673 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.607777 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts\") pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.608085 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.608362 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqg7\" (UniqueName: \"kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.608523 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2l9\" (UniqueName: \"kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.608619 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktp6\" (UniqueName: \"kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6\") 
pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.608702 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.618255 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.628881 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9656-account-create-update-k8fz9"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.633774 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2l9\" (UniqueName: \"kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9\") pod \"nova-cell0-db-create-ks9t7\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.636403 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.641132 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.678942 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-k8fz9"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.694658 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqg7\" (UniqueName: \"kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710440 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktp6\" (UniqueName: \"kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6\") pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710499 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710543 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts\") pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.710609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9nzd\" (UniqueName: \"kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.711909 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.744217 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts\") pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: 
I0223 07:07:17.746372 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqg7\" (UniqueName: \"kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7\") pod \"nova-cell1-db-create-n6s7x\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.757789 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktp6\" (UniqueName: \"kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6\") pod \"nova-api-2de9-account-create-update-8hbsf\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.775127 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.777130 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.781650 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.781941 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.782123 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.794078 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.799938 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.804824 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-241c-account-create-update-tnztd"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.806061 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.808151 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812251 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812274 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg2xq\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812326 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m9nzd\" (UniqueName: \"kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812348 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812371 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812395 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812421 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812465 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.812514 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.814975 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.823479 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-tnztd"] Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.830898 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.866018 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9nzd\" (UniqueName: \"kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd\") pod \"nova-cell0-9656-account-create-update-k8fz9\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914588 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914651 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jskxb\" (UniqueName: \"kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914755 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914775 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914796 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg2xq\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914824 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914849 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914875 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.914908 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.915647 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.916842 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.922526 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.923424 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle\") pod 
\"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.923637 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.929745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.930761 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.934351 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg2xq\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq\") pod \"swift-proxy-77bd586555-4s2g7\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") " pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:17 crc kubenswrapper[5118]: I0223 07:07:17.994700 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.017160 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jskxb\" (UniqueName: \"kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.017328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.018117 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.037784 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jskxb\" (UniqueName: \"kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb\") pod \"nova-cell1-241c-account-create-update-tnztd\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.130805 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.174554 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:18 crc kubenswrapper[5118]: I0223 07:07:18.560177 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.660701 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-n6s7x"] Feb 23 07:07:23 crc kubenswrapper[5118]: W0223 07:07:23.662284 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814f160f_4a46_433d_8abb_bdf4b6ce65d3.slice/crio-e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76 WatchSource:0}: Error finding container e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76: Status 404 returned error can't find the container with id e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76 Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.695374 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-k8fz9"] Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.843802 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-tnztd"] Feb 23 07:07:23 crc kubenswrapper[5118]: W0223 07:07:23.865773 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66bdd8af_d48f_4419_83cc_c06f0b71ee32.slice/crio-ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca WatchSource:0}: Error finding container ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca: Status 404 returned error can't find the container with id 
ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.876155 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2de9-account-create-update-8hbsf"] Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.904499 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-v6s5m"] Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.959162 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ks9t7"] Feb 23 07:07:23 crc kubenswrapper[5118]: I0223 07:07:23.995281 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"] Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.040652 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ks9t7" event={"ID":"84786dbb-5d55-4ca5-bceb-b66bd83d6c06","Type":"ContainerStarted","Data":"134b5d3ae4dea8158fb06bcf52c73fe796a50d90ac4fe7f3e662f4d5081e3b48"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.063028 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6s7x" event={"ID":"814f160f-4a46-433d-8abb-bdf4b6ce65d3","Type":"ContainerStarted","Data":"6d1930f97fefea30957a1d47896f52db8ffd741601261f9d1da5be58352aff9b"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.063084 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6s7x" event={"ID":"814f160f-4a46-433d-8abb-bdf4b6ce65d3","Type":"ContainerStarted","Data":"e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.067754 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" event={"ID":"92010864-4f7f-4066-9ce5-d14281346f97","Type":"ContainerStarted","Data":"8f957cc8e11a9fbbf6380e9c0269fb9c7337a8d09534ea1c9361fe65ece901ca"} 
Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.067814 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" event={"ID":"92010864-4f7f-4066-9ce5-d14281346f97","Type":"ContainerStarted","Data":"6fd5a6dc453358c060d35ad7bc0a662207824e8d65c3838f9a6dcca2f6417988"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.069443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-241c-account-create-update-tnztd" event={"ID":"8bc15918-ffec-4589-85ea-bdf3999dc8d6","Type":"ContainerStarted","Data":"f6780d87ef21cd016c8094998486080c7fa49a22e9cf9eb42237391adea345b8"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.080533 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v6s5m" event={"ID":"2e0f3a81-94a7-4553-abc5-d58e0150aeea","Type":"ContainerStarted","Data":"d55ac6e3d7545f124afd84e3866618b79d70b02b9f2e68d4ee8901a2e09dedd5"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.084923 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2de9-account-create-update-8hbsf" event={"ID":"66bdd8af-d48f-4419-83cc-c06f0b71ee32","Type":"ContainerStarted","Data":"ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.087353 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b3633e5-65f3-41c8-be57-5c4e28227ec9","Type":"ContainerStarted","Data":"4ba34d66d196c48e8577f6109b59eff4f5b86c53ed431eae0c2091d19a42b92c"} Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.095523 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-n6s7x" podStartSLOduration=7.095496712 podStartE2EDuration="7.095496712s" podCreationTimestamp="2026-02-23 07:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-23 07:07:24.077371212 +0000 UTC m=+1307.081155785" watchObservedRunningTime="2026-02-23 07:07:24.095496712 +0000 UTC m=+1307.099281285" Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.105336 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" podStartSLOduration=7.105316925 podStartE2EDuration="7.105316925s" podCreationTimestamp="2026-02-23 07:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:24.093044753 +0000 UTC m=+1307.096829326" watchObservedRunningTime="2026-02-23 07:07:24.105316925 +0000 UTC m=+1307.109101488" Feb 23 07:07:24 crc kubenswrapper[5118]: I0223 07:07:24.112306 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.969230808 podStartE2EDuration="12.1122794s" podCreationTimestamp="2026-02-23 07:07:12 +0000 UTC" firstStartedPulling="2026-02-23 07:07:12.951785881 +0000 UTC m=+1295.955570454" lastFinishedPulling="2026-02-23 07:07:23.094834473 +0000 UTC m=+1306.098619046" observedRunningTime="2026-02-23 07:07:24.10975752 +0000 UTC m=+1307.113542093" watchObservedRunningTime="2026-02-23 07:07:24.1122794 +0000 UTC m=+1307.116063973" Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.104052 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerStarted","Data":"66b033f5cdba1801f896cac33e18ec0b1d1f6261e8ea02c514f953da7c1c5a84"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.104398 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerStarted","Data":"d4786168035f880a50d9daf3c146a82f881d078e61fee117923247704b6bc372"} Feb 
23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.104414 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerStarted","Data":"8ae6db35f5b3512c867b9317844df3160b3a086707b468e0c8be638a0ef9f66a"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.104653 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.106287 5118 generic.go:334] "Generic (PLEG): container finished" podID="84786dbb-5d55-4ca5-bceb-b66bd83d6c06" containerID="a331d02a6eec032c5afe8e514192d44b34c7dc18aaea080dddcb420228f17459" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.106334 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ks9t7" event={"ID":"84786dbb-5d55-4ca5-bceb-b66bd83d6c06","Type":"ContainerDied","Data":"a331d02a6eec032c5afe8e514192d44b34c7dc18aaea080dddcb420228f17459"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.109326 5118 generic.go:334] "Generic (PLEG): container finished" podID="814f160f-4a46-433d-8abb-bdf4b6ce65d3" containerID="6d1930f97fefea30957a1d47896f52db8ffd741601261f9d1da5be58352aff9b" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.109387 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6s7x" event={"ID":"814f160f-4a46-433d-8abb-bdf4b6ce65d3","Type":"ContainerDied","Data":"6d1930f97fefea30957a1d47896f52db8ffd741601261f9d1da5be58352aff9b"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.112781 5118 generic.go:334] "Generic (PLEG): container finished" podID="92010864-4f7f-4066-9ce5-d14281346f97" containerID="8f957cc8e11a9fbbf6380e9c0269fb9c7337a8d09534ea1c9361fe65ece901ca" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.112812 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-9656-account-create-update-k8fz9" event={"ID":"92010864-4f7f-4066-9ce5-d14281346f97","Type":"ContainerDied","Data":"8f957cc8e11a9fbbf6380e9c0269fb9c7337a8d09534ea1c9361fe65ece901ca"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.117029 5118 generic.go:334] "Generic (PLEG): container finished" podID="8bc15918-ffec-4589-85ea-bdf3999dc8d6" containerID="734762bdf80dd183a26cac81a8b5b957a69d69feded0dd37247d0ea7e3a3cd24" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.117186 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-241c-account-create-update-tnztd" event={"ID":"8bc15918-ffec-4589-85ea-bdf3999dc8d6","Type":"ContainerDied","Data":"734762bdf80dd183a26cac81a8b5b957a69d69feded0dd37247d0ea7e3a3cd24"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.121289 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e0f3a81-94a7-4553-abc5-d58e0150aeea" containerID="ee31de91e40b156a453f3da6cf9d535c4074b1b72364559bf6cedba450085aa1" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.121366 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v6s5m" event={"ID":"2e0f3a81-94a7-4553-abc5-d58e0150aeea","Type":"ContainerDied","Data":"ee31de91e40b156a453f3da6cf9d535c4074b1b72364559bf6cedba450085aa1"} Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.135052 5118 generic.go:334] "Generic (PLEG): container finished" podID="66bdd8af-d48f-4419-83cc-c06f0b71ee32" containerID="1b8d408fe3b77e3bb503c8839eb4d16302c83548ed27ca45d62d3fefc3f0f0fb" exitCode=0 Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.135964 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-77bd586555-4s2g7" podStartSLOduration=8.135936994 podStartE2EDuration="8.135936994s" podCreationTimestamp="2026-02-23 07:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:25.128968108 +0000 UTC m=+1308.132752681" watchObservedRunningTime="2026-02-23 07:07:25.135936994 +0000 UTC m=+1308.139721577" Feb 23 07:07:25 crc kubenswrapper[5118]: I0223 07:07:25.136268 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2de9-account-create-update-8hbsf" event={"ID":"66bdd8af-d48f-4419-83cc-c06f0b71ee32","Type":"ContainerDied","Data":"1b8d408fe3b77e3bb503c8839eb4d16302c83548ed27ca45d62d3fefc3f0f0fb"} Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.160684 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.885545 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.893640 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.898162 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.904133 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.910039 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:26 crc kubenswrapper[5118]: I0223 07:07:26.917868 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.066858 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bktp6\" (UniqueName: \"kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6\") pod \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.066969 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts\") pod \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067064 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffqg7\" (UniqueName: \"kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7\") pod \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\" (UID: \"814f160f-4a46-433d-8abb-bdf4b6ce65d3\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067166 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts\") pod \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067374 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts\") pod \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067400 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts\") pod \"92010864-4f7f-4066-9ce5-d14281346f97\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067502 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g2l9\" (UniqueName: \"kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9\") pod \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067699 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcgvh\" (UniqueName: \"kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh\") pod \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\" (UID: \"2e0f3a81-94a7-4553-abc5-d58e0150aeea\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067704 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "814f160f-4a46-433d-8abb-bdf4b6ce65d3" (UID: "814f160f-4a46-433d-8abb-bdf4b6ce65d3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts\") pod \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\" (UID: \"84786dbb-5d55-4ca5-bceb-b66bd83d6c06\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067823 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jskxb\" (UniqueName: \"kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb\") pod \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\" (UID: \"8bc15918-ffec-4589-85ea-bdf3999dc8d6\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067911 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9nzd\" (UniqueName: \"kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd\") pod \"92010864-4f7f-4066-9ce5-d14281346f97\" (UID: \"92010864-4f7f-4066-9ce5-d14281346f97\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067754 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bc15918-ffec-4589-85ea-bdf3999dc8d6" (UID: "8bc15918-ffec-4589-85ea-bdf3999dc8d6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.067960 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts\") pod \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\" (UID: \"66bdd8af-d48f-4419-83cc-c06f0b71ee32\") " Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.068025 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e0f3a81-94a7-4553-abc5-d58e0150aeea" (UID: "2e0f3a81-94a7-4553-abc5-d58e0150aeea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.068713 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84786dbb-5d55-4ca5-bceb-b66bd83d6c06" (UID: "84786dbb-5d55-4ca5-bceb-b66bd83d6c06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.069062 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92010864-4f7f-4066-9ce5-d14281346f97" (UID: "92010864-4f7f-4066-9ce5-d14281346f97"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.069154 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66bdd8af-d48f-4419-83cc-c06f0b71ee32" (UID: "66bdd8af-d48f-4419-83cc-c06f0b71ee32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070447 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/814f160f-4a46-433d-8abb-bdf4b6ce65d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070482 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bc15918-ffec-4589-85ea-bdf3999dc8d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070493 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92010864-4f7f-4066-9ce5-d14281346f97-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070506 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0f3a81-94a7-4553-abc5-d58e0150aeea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070518 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.070527 5118 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66bdd8af-d48f-4419-83cc-c06f0b71ee32-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.074600 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh" (OuterVolumeSpecName: "kube-api-access-jcgvh") pod "2e0f3a81-94a7-4553-abc5-d58e0150aeea" (UID: "2e0f3a81-94a7-4553-abc5-d58e0150aeea"). InnerVolumeSpecName "kube-api-access-jcgvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.074677 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd" (OuterVolumeSpecName: "kube-api-access-m9nzd") pod "92010864-4f7f-4066-9ce5-d14281346f97" (UID: "92010864-4f7f-4066-9ce5-d14281346f97"). InnerVolumeSpecName "kube-api-access-m9nzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.074758 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6" (OuterVolumeSpecName: "kube-api-access-bktp6") pod "66bdd8af-d48f-4419-83cc-c06f0b71ee32" (UID: "66bdd8af-d48f-4419-83cc-c06f0b71ee32"). InnerVolumeSpecName "kube-api-access-bktp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.077890 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7" (OuterVolumeSpecName: "kube-api-access-ffqg7") pod "814f160f-4a46-433d-8abb-bdf4b6ce65d3" (UID: "814f160f-4a46-433d-8abb-bdf4b6ce65d3"). InnerVolumeSpecName "kube-api-access-ffqg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.077967 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9" (OuterVolumeSpecName: "kube-api-access-8g2l9") pod "84786dbb-5d55-4ca5-bceb-b66bd83d6c06" (UID: "84786dbb-5d55-4ca5-bceb-b66bd83d6c06"). InnerVolumeSpecName "kube-api-access-8g2l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.078014 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb" (OuterVolumeSpecName: "kube-api-access-jskxb") pod "8bc15918-ffec-4589-85ea-bdf3999dc8d6" (UID: "8bc15918-ffec-4589-85ea-bdf3999dc8d6"). InnerVolumeSpecName "kube-api-access-jskxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172281 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g2l9\" (UniqueName: \"kubernetes.io/projected/84786dbb-5d55-4ca5-bceb-b66bd83d6c06-kube-api-access-8g2l9\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172320 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcgvh\" (UniqueName: \"kubernetes.io/projected/2e0f3a81-94a7-4553-abc5-d58e0150aeea-kube-api-access-jcgvh\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172334 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jskxb\" (UniqueName: \"kubernetes.io/projected/8bc15918-ffec-4589-85ea-bdf3999dc8d6-kube-api-access-jskxb\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172347 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9nzd\" (UniqueName: 
\"kubernetes.io/projected/92010864-4f7f-4066-9ce5-d14281346f97-kube-api-access-m9nzd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172361 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bktp6\" (UniqueName: \"kubernetes.io/projected/66bdd8af-d48f-4419-83cc-c06f0b71ee32-kube-api-access-bktp6\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172373 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffqg7\" (UniqueName: \"kubernetes.io/projected/814f160f-4a46-433d-8abb-bdf4b6ce65d3-kube-api-access-ffqg7\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172463 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" event={"ID":"92010864-4f7f-4066-9ce5-d14281346f97","Type":"ContainerDied","Data":"6fd5a6dc453358c060d35ad7bc0a662207824e8d65c3838f9a6dcca2f6417988"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172531 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-k8fz9" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.172553 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd5a6dc453358c060d35ad7bc0a662207824e8d65c3838f9a6dcca2f6417988" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.179328 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-241c-account-create-update-tnztd" event={"ID":"8bc15918-ffec-4589-85ea-bdf3999dc8d6","Type":"ContainerDied","Data":"f6780d87ef21cd016c8094998486080c7fa49a22e9cf9eb42237391adea345b8"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.179390 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6780d87ef21cd016c8094998486080c7fa49a22e9cf9eb42237391adea345b8" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.179495 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-tnztd" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.188759 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-v6s5m" event={"ID":"2e0f3a81-94a7-4553-abc5-d58e0150aeea","Type":"ContainerDied","Data":"d55ac6e3d7545f124afd84e3866618b79d70b02b9f2e68d4ee8901a2e09dedd5"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.188811 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d55ac6e3d7545f124afd84e3866618b79d70b02b9f2e68d4ee8901a2e09dedd5" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.188963 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-v6s5m" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.190485 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2de9-account-create-update-8hbsf" event={"ID":"66bdd8af-d48f-4419-83cc-c06f0b71ee32","Type":"ContainerDied","Data":"ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.190507 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9151ba84eb00a77b60efb97052bbe970b045ebdd10d2ab1e52c3249e48aeca" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.190558 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-8hbsf" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.195223 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ks9t7" event={"ID":"84786dbb-5d55-4ca5-bceb-b66bd83d6c06","Type":"ContainerDied","Data":"134b5d3ae4dea8158fb06bcf52c73fe796a50d90ac4fe7f3e662f4d5081e3b48"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.195296 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134b5d3ae4dea8158fb06bcf52c73fe796a50d90ac4fe7f3e662f4d5081e3b48" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.195379 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ks9t7" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.201560 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-n6s7x" event={"ID":"814f160f-4a46-433d-8abb-bdf4b6ce65d3","Type":"ContainerDied","Data":"e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76"} Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.201822 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a8a359c808671874d726d42b7179ed377afc483cf0d86c0ed65d9e54267a76" Feb 23 07:07:27 crc kubenswrapper[5118]: I0223 07:07:27.201611 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-n6s7x" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.944047 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tnnx"] Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.945022 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0f3a81-94a7-4553-abc5-d58e0150aeea" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.945037 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0f3a81-94a7-4553-abc5-d58e0150aeea" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.945055 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84786dbb-5d55-4ca5-bceb-b66bd83d6c06" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.945063 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84786dbb-5d55-4ca5-bceb-b66bd83d6c06" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.945080 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92010864-4f7f-4066-9ce5-d14281346f97" containerName="mariadb-account-create-update" Feb 
23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.945091 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="92010864-4f7f-4066-9ce5-d14281346f97" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.946079 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc15918-ffec-4589-85ea-bdf3999dc8d6" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946090 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc15918-ffec-4589-85ea-bdf3999dc8d6" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.946120 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66bdd8af-d48f-4419-83cc-c06f0b71ee32" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946129 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="66bdd8af-d48f-4419-83cc-c06f0b71ee32" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: E0223 07:07:32.946143 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814f160f-4a46-433d-8abb-bdf4b6ce65d3" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946152 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="814f160f-4a46-433d-8abb-bdf4b6ce65d3" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946397 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="92010864-4f7f-4066-9ce5-d14281346f97" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946420 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0f3a81-94a7-4553-abc5-d58e0150aeea" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946450 5118 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8bc15918-ffec-4589-85ea-bdf3999dc8d6" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946466 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="66bdd8af-d48f-4419-83cc-c06f0b71ee32" containerName="mariadb-account-create-update" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946477 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84786dbb-5d55-4ca5-bceb-b66bd83d6c06" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.946492 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="814f160f-4a46-433d-8abb-bdf4b6ce65d3" containerName="mariadb-database-create" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.947306 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.949528 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.949528 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rlmzr" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.949720 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.960817 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tnnx"] Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.984282 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.984377 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.984449 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.985625 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:07:32 crc kubenswrapper[5118]: I0223 07:07:32.985732 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6" gracePeriod=600 Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.021078 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdj7\" (UniqueName: \"kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.021773 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.021981 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.022304 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.123374 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.123648 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdj7\" (UniqueName: \"kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc 
kubenswrapper[5118]: I0223 07:07:33.123745 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.123777 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.129254 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.129419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.130293 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.147814 5118 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.147885 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77bd586555-4s2g7" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.147983 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqdj7\" (UniqueName: \"kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7\") pod \"nova-cell0-conductor-db-sync-4tnnx\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.218532 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.296807 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.315542 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.315910 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-669cbc77fb-t2crt" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-api" containerID="cri-o://8f35eaca65d796ab21f4ca0bcf43c8c10780748560c0fee8cea0a49d6e4d4212" gracePeriod=30 Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.318145 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-669cbc77fb-t2crt" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-httpd" containerID="cri-o://2bd607225ed6de7ecdf4edab5d293168921b5d05789c1817b483b260e5a22348" gracePeriod=30 Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.342375 5118 
generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6" exitCode=0 Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.342990 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6"} Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.343166 5118 scope.go:117] "RemoveContainer" containerID="a2567b9bb45aad766de8eaa23f029ab9162c75ff9459d11d1dce42cc736d50e9" Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.836656 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tnnx"] Feb 23 07:07:33 crc kubenswrapper[5118]: W0223 07:07:33.866327 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ff3ee59_87cc_452f_a176_37b6f8d4307b.slice/crio-9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c WatchSource:0}: Error finding container 9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c: Status 404 returned error can't find the container with id 9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.869007 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:07:33 crc kubenswrapper[5118]: I0223 07:07:33.909832 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.051843 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.051948 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.051985 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.052084 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.052210 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.052251 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.052288 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbvtl\" (UniqueName: \"kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl\") pod \"ff639ff8-dd33-445f-8321-6528b227179d\" (UID: \"ff639ff8-dd33-445f-8321-6528b227179d\") " Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.052544 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.053294 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.062417 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl" (OuterVolumeSpecName: "kube-api-access-lbvtl") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "kube-api-access-lbvtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.062578 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts" (OuterVolumeSpecName: "scripts") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.083952 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.125336 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.135275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data" (OuterVolumeSpecName: "config-data") pod "ff639ff8-dd33-445f-8321-6528b227179d" (UID: "ff639ff8-dd33-445f-8321-6528b227179d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154894 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154935 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154948 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154959 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ff639ff8-dd33-445f-8321-6528b227179d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154969 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154983 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ff639ff8-dd33-445f-8321-6528b227179d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.154992 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbvtl\" (UniqueName: \"kubernetes.io/projected/ff639ff8-dd33-445f-8321-6528b227179d-kube-api-access-lbvtl\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.382381 5118 generic.go:334] "Generic 
(PLEG): container finished" podID="ff639ff8-dd33-445f-8321-6528b227179d" containerID="26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1" exitCode=137 Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.382510 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.382517 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerDied","Data":"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1"} Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.382603 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ff639ff8-dd33-445f-8321-6528b227179d","Type":"ContainerDied","Data":"826a07d78195a74f00e2c51d67f74e217188667a281c3a321098529f73635374"} Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.382626 5118 scope.go:117] "RemoveContainer" containerID="26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.401700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab"} Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.411414 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" event={"ID":"8ff3ee59-87cc-452f-a176-37b6f8d4307b","Type":"ContainerStarted","Data":"9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c"} Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.440235 5118 scope.go:117] "RemoveContainer" containerID="5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 
07:07:34.440537 5118 generic.go:334] "Generic (PLEG): container finished" podID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerID="2bd607225ed6de7ecdf4edab5d293168921b5d05789c1817b483b260e5a22348" exitCode=0 Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.440609 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerDied","Data":"2bd607225ed6de7ecdf4edab5d293168921b5d05789c1817b483b260e5a22348"} Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.504142 5118 scope.go:117] "RemoveContainer" containerID="93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.536415 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.559075 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.571738 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 07:07:34.572363 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572384 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 07:07:34.572415 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572422 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 
07:07:34.572450 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572457 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572647 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="ceilometer-notification-agent" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572898 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="sg-core" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.572913 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff639ff8-dd33-445f-8321-6528b227179d" containerName="proxy-httpd" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.575281 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.577816 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.578508 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.584631 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.602029 5118 scope.go:117] "RemoveContainer" containerID="26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1" Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 07:07:34.607607 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1\": container with ID starting with 26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1 not found: ID does not exist" containerID="26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.607684 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1"} err="failed to get container status \"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1\": rpc error: code = NotFound desc = could not find container \"26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1\": container with ID starting with 26b9ff146226143323240ed8ea488f31d57c1c1b9b9835a5c0117b6602137aa1 not found: ID does not exist" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.607729 5118 scope.go:117] "RemoveContainer" containerID="5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8" Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 
07:07:34.608746 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8\": container with ID starting with 5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8 not found: ID does not exist" containerID="5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.608847 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8"} err="failed to get container status \"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8\": rpc error: code = NotFound desc = could not find container \"5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8\": container with ID starting with 5e9becbef5fd85e7f2a732476d973f4343593dd415ba9e8ccdfa171c861c53c8 not found: ID does not exist" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.608966 5118 scope.go:117] "RemoveContainer" containerID="93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500" Feb 23 07:07:34 crc kubenswrapper[5118]: E0223 07:07:34.609559 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500\": container with ID starting with 93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500 not found: ID does not exist" containerID="93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.609624 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500"} err="failed to get container status \"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500\": rpc 
error: code = NotFound desc = could not find container \"93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500\": container with ID starting with 93651b718b970ba2b2e8e27d169e004e178c1195f73952b81c363efeba4d7500 not found: ID does not exist" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776283 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9g5h\" (UniqueName: \"kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776365 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776394 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776459 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776487 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data\") 
pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776508 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.776576 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878171 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878227 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878376 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878438 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9g5h\" (UniqueName: \"kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878496 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878525 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.878616 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.879777 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.879833 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " 
pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.883002 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.883227 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.887230 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.891076 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.898390 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9g5h\" (UniqueName: \"kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h\") pod \"ceilometer-0\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " pod="openstack/ceilometer-0" Feb 23 07:07:34 crc kubenswrapper[5118]: I0223 07:07:34.907785 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:35 crc kubenswrapper[5118]: I0223 07:07:35.386355 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:35 crc kubenswrapper[5118]: I0223 07:07:35.461357 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerStarted","Data":"0ed31202470a27f5a064032e75747ad57670874a33d81156a787befc7fd1ac7e"} Feb 23 07:07:35 crc kubenswrapper[5118]: I0223 07:07:35.712151 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff639ff8-dd33-445f-8321-6528b227179d" path="/var/lib/kubelet/pods/ff639ff8-dd33-445f-8321-6528b227179d/volumes" Feb 23 07:07:36 crc kubenswrapper[5118]: I0223 07:07:36.503383 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerStarted","Data":"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a"} Feb 23 07:07:37 crc kubenswrapper[5118]: I0223 07:07:37.517008 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerStarted","Data":"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125"} Feb 23 07:07:38 crc kubenswrapper[5118]: I0223 07:07:38.532290 5118 generic.go:334] "Generic (PLEG): container finished" podID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerID="8f35eaca65d796ab21f4ca0bcf43c8c10780748560c0fee8cea0a49d6e4d4212" exitCode=0 Feb 23 07:07:38 crc kubenswrapper[5118]: I0223 07:07:38.532374 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerDied","Data":"8f35eaca65d796ab21f4ca0bcf43c8c10780748560c0fee8cea0a49d6e4d4212"} Feb 23 07:07:40 crc kubenswrapper[5118]: I0223 07:07:40.700266 5118 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:40 crc kubenswrapper[5118]: I0223 07:07:40.823013 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.020908 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.100690 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b976557fd-6lkcx"] Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.101137 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b976557fd-6lkcx" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-log" containerID="cri-o://45ffa8bb3a7b2fda9250327348a8dbae14fb92a0d0dba32bea29be4860f5a723" gracePeriod=30 Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.103342 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-b976557fd-6lkcx" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-api" containerID="cri-o://9c2df29044a4a6505db392529c5d66e0c129921b569f448d9246b1dfaaeab1f6" gracePeriod=30 Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.579769 5118 generic.go:334] "Generic (PLEG): container finished" podID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerID="45ffa8bb3a7b2fda9250327348a8dbae14fb92a0d0dba32bea29be4860f5a723" exitCode=143 Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.579853 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerDied","Data":"45ffa8bb3a7b2fda9250327348a8dbae14fb92a0d0dba32bea29be4860f5a723"} Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.715881 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.716715 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-log" containerID="cri-o://115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63" gracePeriod=30 Feb 23 07:07:41 crc kubenswrapper[5118]: I0223 07:07:41.716893 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-httpd" containerID="cri-o://460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127" gracePeriod=30 Feb 23 07:07:42 crc kubenswrapper[5118]: I0223 07:07:42.594381 5118 generic.go:334] "Generic (PLEG): container finished" podID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerID="115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63" exitCode=143 Feb 23 07:07:42 crc kubenswrapper[5118]: I0223 07:07:42.594469 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerDied","Data":"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63"} Feb 23 07:07:42 crc kubenswrapper[5118]: I0223 07:07:42.865826 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:42 crc kubenswrapper[5118]: I0223 07:07:42.866296 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-httpd" containerID="cri-o://67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0" gracePeriod=30 Feb 23 07:07:42 crc kubenswrapper[5118]: I0223 07:07:42.866151 5118 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-log" containerID="cri-o://5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022" gracePeriod=30 Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.657850 5118 generic.go:334] "Generic (PLEG): container finished" podID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerID="5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022" exitCode=143 Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.658494 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerDied","Data":"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022"} Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.708404 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.817420 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config\") pod \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.817562 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvt8s\" (UniqueName: \"kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s\") pod \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.817652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle\") pod 
\"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.817681 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs\") pod \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.817924 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config\") pod \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\" (UID: \"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2\") " Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.827620 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s" (OuterVolumeSpecName: "kube-api-access-fvt8s") pod "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" (UID: "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2"). InnerVolumeSpecName "kube-api-access-fvt8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.833721 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" (UID: "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.873839 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" (UID: "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.879129 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config" (OuterVolumeSpecName: "config") pod "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" (UID: "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.901198 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" (UID: "27637b93-b63b-4da7-9f5b-3e3ed25ce2a2"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.920557 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.920602 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.920614 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvt8s\" (UniqueName: \"kubernetes.io/projected/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-kube-api-access-fvt8s\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.920628 5118 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:43 crc kubenswrapper[5118]: I0223 07:07:43.920638 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.670784 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerStarted","Data":"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095"} Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.673476 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" 
event={"ID":"8ff3ee59-87cc-452f-a176-37b6f8d4307b","Type":"ContainerStarted","Data":"b790bed49ac437e2b4ca3f9fbeb049900c080653098145f84fa077dc058b9955"} Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.675935 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-669cbc77fb-t2crt" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.675943 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-669cbc77fb-t2crt" event={"ID":"27637b93-b63b-4da7-9f5b-3e3ed25ce2a2","Type":"ContainerDied","Data":"53cb9a6f5b6607145d0065f1061b097e7990aff809ac6e35ec98e20aed2dae2d"} Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.676011 5118 scope.go:117] "RemoveContainer" containerID="2bd607225ed6de7ecdf4edab5d293168921b5d05789c1817b483b260e5a22348" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.683632 5118 generic.go:334] "Generic (PLEG): container finished" podID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerID="9c2df29044a4a6505db392529c5d66e0c129921b569f448d9246b1dfaaeab1f6" exitCode=0 Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.683695 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerDied","Data":"9c2df29044a4a6505db392529c5d66e0c129921b569f448d9246b1dfaaeab1f6"} Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.683801 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b976557fd-6lkcx" event={"ID":"9a0dba31-6eac-45a8-b327-cfa8af20e3cb","Type":"ContainerDied","Data":"dab074f4c5a6cda8167c62ebbfc6403ad510cb26329d6e63a6effd40549b2e61"} Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.683824 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab074f4c5a6cda8167c62ebbfc6403ad510cb26329d6e63a6effd40549b2e61" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.719534 5118 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" podStartSLOduration=3.201196082 podStartE2EDuration="12.7195101s" podCreationTimestamp="2026-02-23 07:07:32 +0000 UTC" firstStartedPulling="2026-02-23 07:07:33.868750175 +0000 UTC m=+1316.872534748" lastFinishedPulling="2026-02-23 07:07:43.387064193 +0000 UTC m=+1326.390848766" observedRunningTime="2026-02-23 07:07:44.71655791 +0000 UTC m=+1327.720342483" watchObservedRunningTime="2026-02-23 07:07:44.7195101 +0000 UTC m=+1327.723294673" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.730352 5118 scope.go:117] "RemoveContainer" containerID="8f35eaca65d796ab21f4ca0bcf43c8c10780748560c0fee8cea0a49d6e4d4212" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.730635 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b976557fd-6lkcx" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.755874 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.770779 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-669cbc77fb-t2crt"] Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866606 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866673 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 
07:07:44.866717 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfht9\" (UniqueName: \"kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866772 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866834 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866881 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.866969 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts\") pod \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\" (UID: \"9a0dba31-6eac-45a8-b327-cfa8af20e3cb\") " Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.872664 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts" (OuterVolumeSpecName: "scripts") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: 
"9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.873032 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs" (OuterVolumeSpecName: "logs") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.877940 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9" (OuterVolumeSpecName: "kube-api-access-qfht9") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "kube-api-access-qfht9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.969668 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.969697 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfht9\" (UniqueName: \"kubernetes.io/projected/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-kube-api-access-qfht9\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.969710 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:44 crc kubenswrapper[5118]: I0223 07:07:44.996469 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.065301 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data" (OuterVolumeSpecName: "config-data") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.073384 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.073419 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.087226 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.108744 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9a0dba31-6eac-45a8-b327-cfa8af20e3cb" (UID: "9a0dba31-6eac-45a8-b327-cfa8af20e3cb"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.177839 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.177879 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0dba31-6eac-45a8-b327-cfa8af20e3cb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.655365 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.710077 5118 generic.go:334] "Generic (PLEG): container finished" podID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerID="460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127" exitCode=0 Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.710309 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723006 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b976557fd-6lkcx" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723241 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" path="/var/lib/kubelet/pods/27637b93-b63b-4da7-9f5b-3e3ed25ce2a2/volumes" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723406 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-central-agent" containerID="cri-o://1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a" gracePeriod=30 Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723591 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="proxy-httpd" containerID="cri-o://5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416" gracePeriod=30 Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723637 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="sg-core" containerID="cri-o://2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095" gracePeriod=30 Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.723710 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-notification-agent" containerID="cri-o://d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125" gracePeriod=30 Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.731263 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerDied","Data":"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127"} Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.731317 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31dd5760-ad76-42fa-bc30-b6cb90f353e5","Type":"ContainerDied","Data":"3d29e9738ad95cfa97695152f119423f98d65f291597f6d1f5769ffb912c7feb"} Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.731329 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerStarted","Data":"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416"} Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.731350 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.731371 5118 scope.go:117] "RemoveContainer" containerID="460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.772323 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.267780315 podStartE2EDuration="11.772302455s" podCreationTimestamp="2026-02-23 07:07:34 +0000 UTC" firstStartedPulling="2026-02-23 07:07:35.407497514 +0000 UTC m=+1318.411282087" lastFinishedPulling="2026-02-23 07:07:44.912019654 +0000 UTC m=+1327.915804227" observedRunningTime="2026-02-23 07:07:45.766524277 +0000 UTC m=+1328.770308850" watchObservedRunningTime="2026-02-23 07:07:45.772302455 +0000 UTC m=+1328.776087028" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810628 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs\") pod 
\"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810688 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810785 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810832 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810891 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp9ml\" (UniqueName: \"kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.810975 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.811066 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.811090 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\" (UID: \"31dd5760-ad76-42fa-bc30-b6cb90f353e5\") " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.813695 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.814326 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs" (OuterVolumeSpecName: "logs") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.824284 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.828244 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts" (OuterVolumeSpecName: "scripts") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.863893 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml" (OuterVolumeSpecName: "kube-api-access-wp9ml") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "kube-api-access-wp9ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.874141 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.894396 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data" (OuterVolumeSpecName: "config-data") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913504 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913605 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd5760-ad76-42fa-bc30-b6cb90f353e5-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913693 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913752 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913807 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913858 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.913911 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp9ml\" (UniqueName: \"kubernetes.io/projected/31dd5760-ad76-42fa-bc30-b6cb90f353e5-kube-api-access-wp9ml\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.933472 5118 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "31dd5760-ad76-42fa-bc30-b6cb90f353e5" (UID: "31dd5760-ad76-42fa-bc30-b6cb90f353e5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.934360 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 23 07:07:45 crc kubenswrapper[5118]: I0223 07:07:45.985763 5118 scope.go:117] "RemoveContainer" containerID="115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.009981 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b976557fd-6lkcx"] Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.015382 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.015409 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31dd5760-ad76-42fa-bc30-b6cb90f353e5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.015553 5118 scope.go:117] "RemoveContainer" containerID="460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.016403 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127\": container with ID starting with 460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127 not found: ID does not exist" 
containerID="460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.016474 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127"} err="failed to get container status \"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127\": rpc error: code = NotFound desc = could not find container \"460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127\": container with ID starting with 460fb419f8dbb5155ecd7a1f6235c505c2968b0755173dc29cbd3f153e5a4127 not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.016506 5118 scope.go:117] "RemoveContainer" containerID="115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.020508 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63\": container with ID starting with 115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63 not found: ID does not exist" containerID="115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.020557 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63"} err="failed to get container status \"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63\": rpc error: code = NotFound desc = could not find container \"115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63\": container with ID starting with 115bd5337a29f2abc7812e519f0f1d8a515447e35b16737157971e82a749ed63 not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.021782 5118 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/placement-b976557fd-6lkcx"] Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.068046 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.083802 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.093739 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094371 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.094390 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094401 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-api" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.094408 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-api" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094427 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.094433 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094444 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-log" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 
07:07:46.094450 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-log" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094458 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-log" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.094463 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-log" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.094479 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.094485 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095290 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-log" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095306 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" containerName="glance-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095327 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-log" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095337 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-api" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095343 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" containerName="placement-api" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.095360 5118 
memory_manager.go:354] "RemoveStaleState removing state" podUID="27637b93-b63b-4da7-9f5b-3e3ed25ce2a2" containerName="neutron-httpd" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.096534 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.101438 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.101516 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.105039 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219458 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219526 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219552 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219577 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdr4s\" (UniqueName: \"kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219636 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.219735 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321491 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321542 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321671 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321696 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" 
Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321717 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdr4s\" (UniqueName: \"kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321744 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.321772 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.322013 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.322583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.322741 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.329994 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.331036 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.335793 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.335969 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.340290 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdr4s\" (UniqueName: 
\"kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.364237 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") " pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.468472 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.630676 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748145 5118 generic.go:334] "Generic (PLEG): container finished" podID="4c334c43-649b-4a34-aad2-c19af063362d" containerID="5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416" exitCode=0 Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748197 5118 generic.go:334] "Generic (PLEG): container finished" podID="4c334c43-649b-4a34-aad2-c19af063362d" containerID="2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095" exitCode=2 Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748208 5118 generic.go:334] "Generic (PLEG): container finished" podID="4c334c43-649b-4a34-aad2-c19af063362d" containerID="d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125" exitCode=0 Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748280 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerDied","Data":"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416"} Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748310 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerDied","Data":"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095"} Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.748322 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerDied","Data":"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125"} Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.751149 5118 generic.go:334] "Generic (PLEG): container finished" podID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerID="67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0" exitCode=0 Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.751196 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerDied","Data":"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0"} Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.751216 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"22653bef-e824-4fc6-9a90-3a275160b3a4","Type":"ContainerDied","Data":"198ac77e2e045888b34196e59bffa93fd7e5ad63f3a76cdaa3617e8783ace41b"} Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.751235 5118 scope.go:117] "RemoveContainer" containerID="67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.751395 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.777926 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.778062 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n7rx\" (UniqueName: \"kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.778976 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.779053 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.779144 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.779186 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.779220 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.779245 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run\") pod \"22653bef-e824-4fc6-9a90-3a275160b3a4\" (UID: \"22653bef-e824-4fc6-9a90-3a275160b3a4\") " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.784440 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.784695 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs" (OuterVolumeSpecName: "logs") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.791936 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx" (OuterVolumeSpecName: "kube-api-access-9n7rx") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "kube-api-access-9n7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.792772 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.795424 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts" (OuterVolumeSpecName: "scripts") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.825282 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.828599 5118 scope.go:117] "RemoveContainer" containerID="5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.842859 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data" (OuterVolumeSpecName: "config-data") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.845672 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "22653bef-e824-4fc6-9a90-3a275160b3a4" (UID: "22653bef-e824-4fc6-9a90-3a275160b3a4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.857823 5118 scope.go:117] "RemoveContainer" containerID="67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.859371 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0\": container with ID starting with 67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0 not found: ID does not exist" containerID="67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.859422 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0"} err="failed to get container status \"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0\": rpc error: code = NotFound desc = could not find container \"67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0\": container with ID starting with 67db000b6995ed03430aefbb0aacc1d4f65555a9369ccc45b071b0647bfe74a0 not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.859456 5118 scope.go:117] "RemoveContainer" containerID="5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022" Feb 23 07:07:46 crc kubenswrapper[5118]: E0223 07:07:46.862246 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022\": container with ID starting with 5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022 not found: ID does not exist" containerID="5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.862311 
5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022"} err="failed to get container status \"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022\": rpc error: code = NotFound desc = could not find container \"5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022\": container with ID starting with 5fc7cf64b424be7e590f29502c488433faa700bb37d0b2034837a12da7241022 not found: ID does not exist" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.880936 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881327 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n7rx\" (UniqueName: \"kubernetes.io/projected/22653bef-e824-4fc6-9a90-3a275160b3a4-kube-api-access-9n7rx\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881342 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881354 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881363 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881372 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881382 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22653bef-e824-4fc6-9a90-3a275160b3a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.881393 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/22653bef-e824-4fc6-9a90-3a275160b3a4-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.914207 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 23 07:07:46 crc kubenswrapper[5118]: I0223 07:07:46.983703 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.097546 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.129821 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.154829 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:47 crc kubenswrapper[5118]: E0223 07:07:47.155515 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-log" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.155543 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-log" Feb 23 
07:07:47 crc kubenswrapper[5118]: E0223 07:07:47.155551 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-httpd" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.155557 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-httpd" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.155767 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-log" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.155797 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" containerName="glance-httpd" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.157074 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.159889 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.160242 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.185612 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:47 crc kubenswrapper[5118]: W0223 07:07:47.236783 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod376ff246_417d_442a_83d1_1579abd318ba.slice/crio-a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0 WatchSource:0}: Error finding container a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0: Status 404 returned error can't find the container with id 
a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0 Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.240489 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.290773 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.290862 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.290948 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.291009 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdnj8\" (UniqueName: \"kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.291064 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.291133 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.291213 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.291253 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.393626 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394026 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394125 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394151 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394171 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdnj8\" (UniqueName: \"kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394222 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394283 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394315 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.394554 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.395022 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.395297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.402115 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.403859 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.403922 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.404759 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.420547 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdnj8\" (UniqueName: \"kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.440464 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") " 
pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.477387 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.721256 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22653bef-e824-4fc6-9a90-3a275160b3a4" path="/var/lib/kubelet/pods/22653bef-e824-4fc6-9a90-3a275160b3a4/volumes" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.722468 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31dd5760-ad76-42fa-bc30-b6cb90f353e5" path="/var/lib/kubelet/pods/31dd5760-ad76-42fa-bc30-b6cb90f353e5/volumes" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.723381 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0dba31-6eac-45a8-b327-cfa8af20e3cb" path="/var/lib/kubelet/pods/9a0dba31-6eac-45a8-b327-cfa8af20e3cb/volumes" Feb 23 07:07:47 crc kubenswrapper[5118]: I0223 07:07:47.765145 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerStarted","Data":"a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.187019 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.377700 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519250 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519281 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519331 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519442 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519527 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.519576 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9g5h\" (UniqueName: \"kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h\") pod \"4c334c43-649b-4a34-aad2-c19af063362d\" (UID: \"4c334c43-649b-4a34-aad2-c19af063362d\") " Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.520028 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.520303 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.547235 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h" (OuterVolumeSpecName: "kube-api-access-r9g5h") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "kube-api-access-r9g5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.547277 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts" (OuterVolumeSpecName: "scripts") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.562237 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.622814 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.622864 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9g5h\" (UniqueName: \"kubernetes.io/projected/4c334c43-649b-4a34-aad2-c19af063362d-kube-api-access-r9g5h\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.622882 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.622896 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc 
kubenswrapper[5118]: I0223 07:07:48.622910 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4c334c43-649b-4a34-aad2-c19af063362d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.641364 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.678912 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data" (OuterVolumeSpecName: "config-data") pod "4c334c43-649b-4a34-aad2-c19af063362d" (UID: "4c334c43-649b-4a34-aad2-c19af063362d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.725738 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.725770 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c334c43-649b-4a34-aad2-c19af063362d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.783213 5118 generic.go:334] "Generic (PLEG): container finished" podID="4c334c43-649b-4a34-aad2-c19af063362d" containerID="1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a" exitCode=0 Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.783303 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerDied","Data":"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.783358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4c334c43-649b-4a34-aad2-c19af063362d","Type":"ContainerDied","Data":"0ed31202470a27f5a064032e75747ad57670874a33d81156a787befc7fd1ac7e"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.783381 5118 scope.go:117] "RemoveContainer" containerID="5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.783548 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.798446 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerStarted","Data":"f79ac5b11fb0deba4274f00ce33e6c2a41832523dee197459bcb484c55984024"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.798551 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerStarted","Data":"7aa9aba83a6a4faa226f1ac01b3aa1d78be1db352ff3cdcb6601be775a19cd5c"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.802626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerStarted","Data":"7d93bed45f12f635a4f79ad66401c18ff3cd1fb3434c49200ff33284cce196d3"} Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.810519 5118 scope.go:117] "RemoveContainer" containerID="2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.828195 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.828174688 podStartE2EDuration="2.828174688s" podCreationTimestamp="2026-02-23 07:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:48.827623895 +0000 UTC m=+1331.831408468" watchObservedRunningTime="2026-02-23 07:07:48.828174688 +0000 UTC m=+1331.831959261" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.855848 5118 scope.go:117] "RemoveContainer" containerID="d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.858121 5118 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.878006 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.887852 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:48 crc kubenswrapper[5118]: E0223 07:07:48.888329 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-central-agent" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888350 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-central-agent" Feb 23 07:07:48 crc kubenswrapper[5118]: E0223 07:07:48.888365 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="sg-core" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888372 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="sg-core" Feb 23 07:07:48 crc kubenswrapper[5118]: E0223 07:07:48.888382 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="proxy-httpd" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888390 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="proxy-httpd" Feb 23 07:07:48 crc kubenswrapper[5118]: E0223 07:07:48.888412 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-notification-agent" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888418 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-notification-agent" Feb 23 07:07:48 
crc kubenswrapper[5118]: I0223 07:07:48.888608 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="proxy-httpd" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888618 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="sg-core" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888630 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-notification-agent" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.888649 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c334c43-649b-4a34-aad2-c19af063362d" containerName="ceilometer-central-agent" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.894967 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.899030 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.899187 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.908375 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:48 crc kubenswrapper[5118]: I0223 07:07:48.932513 5118 scope.go:117] "RemoveContainer" containerID="1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.014448 5118 scope.go:117] "RemoveContainer" containerID="5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416" Feb 23 07:07:49 crc kubenswrapper[5118]: E0223 07:07:49.018427 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416\": container with ID starting with 5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416 not found: ID does not exist" containerID="5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.018480 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416"} err="failed to get container status \"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416\": rpc error: code = NotFound desc = could not find container \"5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416\": container with ID starting with 5357400fe2b6ac4fd392b5f4d3ef599540665b5d73b0c43e312c118bfe91a416 not found: ID does not exist" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.018519 5118 scope.go:117] "RemoveContainer" containerID="2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095" Feb 23 07:07:49 crc kubenswrapper[5118]: E0223 07:07:49.018840 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095\": container with ID starting with 2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095 not found: ID does not exist" containerID="2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.018862 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095"} err="failed to get container status \"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095\": rpc error: code = NotFound desc = could not find container \"2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095\": container with ID 
starting with 2caf8aca02f0e5c0ab1f12f4db24990e2e36da4f5647ce4825798fd5b6e9b095 not found: ID does not exist" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.018879 5118 scope.go:117] "RemoveContainer" containerID="d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125" Feb 23 07:07:49 crc kubenswrapper[5118]: E0223 07:07:49.022122 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125\": container with ID starting with d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125 not found: ID does not exist" containerID="d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.022179 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125"} err="failed to get container status \"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125\": rpc error: code = NotFound desc = could not find container \"d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125\": container with ID starting with d64d2eb3cce28898996d7f6286589396571a3ee420b796b61b535fc16d50c125 not found: ID does not exist" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.022217 5118 scope.go:117] "RemoveContainer" containerID="1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a" Feb 23 07:07:49 crc kubenswrapper[5118]: E0223 07:07:49.023541 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a\": container with ID starting with 1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a not found: ID does not exist" containerID="1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a" Feb 23 
07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.023588 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a"} err="failed to get container status \"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a\": rpc error: code = NotFound desc = could not find container \"1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a\": container with ID starting with 1a2f1e81153ead5748584618d147b8523c5115f1bbd45fd2fbe251db8ce0914a not found: ID does not exist" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036423 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036515 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqbt\" (UniqueName: \"kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036552 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036575 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036667 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.036690 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138696 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138763 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqbt\" (UniqueName: \"kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138795 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138820 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138856 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138917 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.138943 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.139500 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " 
pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.142163 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.146336 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.147668 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.148006 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.148743 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.158293 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sqbt\" (UniqueName: 
\"kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt\") pod \"ceilometer-0\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") " pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.257076 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.709117 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c334c43-649b-4a34-aad2-c19af063362d" path="/var/lib/kubelet/pods/4c334c43-649b-4a34-aad2-c19af063362d/volumes" Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.796924 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:07:49 crc kubenswrapper[5118]: W0223 07:07:49.808695 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92146b1_cfa3_40c6_9c91_cb268024bd23.slice/crio-2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed WatchSource:0}: Error finding container 2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed: Status 404 returned error can't find the container with id 2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.814344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerStarted","Data":"4000f09676001090ea18cdb7df7926a1a769a2407d4b9008c3f8702217c3491e"} Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.814405 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerStarted","Data":"6b8c6eb9b9ce37fbe932ac5074db82c6dfa9e1f23cb6e0b090172cccdaf73dcf"} Feb 23 07:07:49 crc kubenswrapper[5118]: I0223 07:07:49.838190 5118 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.838169807 podStartE2EDuration="2.838169807s" podCreationTimestamp="2026-02-23 07:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:07:49.837177294 +0000 UTC m=+1332.840961877" watchObservedRunningTime="2026-02-23 07:07:49.838169807 +0000 UTC m=+1332.841954380" Feb 23 07:07:50 crc kubenswrapper[5118]: I0223 07:07:50.831253 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerStarted","Data":"5b1673251e24d33e14cb885d1a30100b57c0b80b0883e50be4e26c0544e5eeaa"} Feb 23 07:07:50 crc kubenswrapper[5118]: I0223 07:07:50.832277 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerStarted","Data":"2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed"} Feb 23 07:07:51 crc kubenswrapper[5118]: I0223 07:07:51.843995 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerStarted","Data":"63d318b94849d84942e140978eb3d696a0f38b314c7ae1a84ee03f020ac59443"} Feb 23 07:07:52 crc kubenswrapper[5118]: I0223 07:07:52.860139 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerStarted","Data":"85649c8553e19892cd3a25464dcbc4e9c7dbd2ca6f717c585fde0037df3d2ef1"} Feb 23 07:07:53 crc kubenswrapper[5118]: I0223 07:07:53.892847 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerStarted","Data":"a065dfe336ce5b3881868f56481691176a9b31d3f9dbcb65b5a810d6d7860b51"} 
Feb 23 07:07:53 crc kubenswrapper[5118]: I0223 07:07:53.893461 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:07:53 crc kubenswrapper[5118]: I0223 07:07:53.921343 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.779770586 podStartE2EDuration="5.921321962s" podCreationTimestamp="2026-02-23 07:07:48 +0000 UTC" firstStartedPulling="2026-02-23 07:07:49.81632456 +0000 UTC m=+1332.820109133" lastFinishedPulling="2026-02-23 07:07:52.957875896 +0000 UTC m=+1335.961660509" observedRunningTime="2026-02-23 07:07:53.91575291 +0000 UTC m=+1336.919537503" watchObservedRunningTime="2026-02-23 07:07:53.921321962 +0000 UTC m=+1336.925106535" Feb 23 07:07:55 crc kubenswrapper[5118]: I0223 07:07:55.915357 5118 generic.go:334] "Generic (PLEG): container finished" podID="8ff3ee59-87cc-452f-a176-37b6f8d4307b" containerID="b790bed49ac437e2b4ca3f9fbeb049900c080653098145f84fa077dc058b9955" exitCode=0 Feb 23 07:07:55 crc kubenswrapper[5118]: I0223 07:07:55.915443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" event={"ID":"8ff3ee59-87cc-452f-a176-37b6f8d4307b","Type":"ContainerDied","Data":"b790bed49ac437e2b4ca3f9fbeb049900c080653098145f84fa077dc058b9955"} Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.470757 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.473251 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.529234 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.559717 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.925987 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:07:56 crc kubenswrapper[5118]: I0223 07:07:56.926246 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.317220 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.452652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqdj7\" (UniqueName: \"kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7\") pod \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.452747 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle\") pod \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.452917 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data\") pod \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\" (UID: \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.453162 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts\") pod \"8ff3ee59-87cc-452f-a176-37b6f8d4307b\" (UID: 
\"8ff3ee59-87cc-452f-a176-37b6f8d4307b\") " Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.462726 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts" (OuterVolumeSpecName: "scripts") pod "8ff3ee59-87cc-452f-a176-37b6f8d4307b" (UID: "8ff3ee59-87cc-452f-a176-37b6f8d4307b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.463074 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7" (OuterVolumeSpecName: "kube-api-access-dqdj7") pod "8ff3ee59-87cc-452f-a176-37b6f8d4307b" (UID: "8ff3ee59-87cc-452f-a176-37b6f8d4307b"). InnerVolumeSpecName "kube-api-access-dqdj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.477713 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.477811 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.504793 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data" (OuterVolumeSpecName: "config-data") pod "8ff3ee59-87cc-452f-a176-37b6f8d4307b" (UID: "8ff3ee59-87cc-452f-a176-37b6f8d4307b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.510512 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ff3ee59-87cc-452f-a176-37b6f8d4307b" (UID: "8ff3ee59-87cc-452f-a176-37b6f8d4307b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.540502 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.555601 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.558260 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqdj7\" (UniqueName: \"kubernetes.io/projected/8ff3ee59-87cc-452f-a176-37b6f8d4307b-kube-api-access-dqdj7\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.558301 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.558326 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.558345 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ff3ee59-87cc-452f-a176-37b6f8d4307b-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.941693 5118 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.941965 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4tnnx" event={"ID":"8ff3ee59-87cc-452f-a176-37b6f8d4307b","Type":"ContainerDied","Data":"9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c"} Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.942419 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d70fb9b8fae1321d06ba55e8a364cc9f9281f1e470125f494ac61b86ada525c" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.943425 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:57 crc kubenswrapper[5118]: I0223 07:07:57.943508 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.140341 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:07:58 crc kubenswrapper[5118]: E0223 07:07:58.140984 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff3ee59-87cc-452f-a176-37b6f8d4307b" containerName="nova-cell0-conductor-db-sync" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.141010 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff3ee59-87cc-452f-a176-37b6f8d4307b" containerName="nova-cell0-conductor-db-sync" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.141304 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff3ee59-87cc-452f-a176-37b6f8d4307b" containerName="nova-cell0-conductor-db-sync" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.142200 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.147322 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rlmzr" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.148073 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.152835 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.277670 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.278053 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtg9b\" (UniqueName: \"kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.278119 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.379882 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtg9b\" (UniqueName: 
\"kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.379938 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.380032 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.386189 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.396736 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.413384 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtg9b\" (UniqueName: \"kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b\") pod \"nova-cell0-conductor-0\" (UID: 
\"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.462526 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.968569 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.969083 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:07:58 crc kubenswrapper[5118]: I0223 07:07:58.972060 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 07:07:59 crc kubenswrapper[5118]: I0223 07:07:59.010637 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:07:59 crc kubenswrapper[5118]: I0223 07:07:59.979767 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"334e9392-6a5f-4aa8-83d7-41e26e94dd32","Type":"ContainerStarted","Data":"6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472"} Feb 23 07:07:59 crc kubenswrapper[5118]: I0223 07:07:59.980464 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"334e9392-6a5f-4aa8-83d7-41e26e94dd32","Type":"ContainerStarted","Data":"1fcd449de2364e242719177391062d4a455abf457bbd8dcfb5134cf4c7c827b9"} Feb 23 07:07:59 crc kubenswrapper[5118]: I0223 07:07:59.980536 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:00 crc kubenswrapper[5118]: I0223 07:08:00.006411 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.006388067 podStartE2EDuration="2.006388067s" podCreationTimestamp="2026-02-23 07:07:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:00.002142417 +0000 UTC m=+1343.005926990" watchObservedRunningTime="2026-02-23 07:08:00.006388067 +0000 UTC m=+1343.010172640" Feb 23 07:08:00 crc kubenswrapper[5118]: I0223 07:08:00.093435 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:00 crc kubenswrapper[5118]: I0223 07:08:00.094478 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 07:08:00 crc kubenswrapper[5118]: I0223 07:08:00.108542 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 07:08:08 crc kubenswrapper[5118]: I0223 07:08:08.516620 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.189829 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h47kr"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.196448 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.200249 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.201192 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h47kr"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.201205 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.291635 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.291949 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9qdg\" (UniqueName: \"kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.292198 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.292392 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.394716 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.395179 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.395204 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.395241 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9qdg\" (UniqueName: \"kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.414983 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.415626 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.426222 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.485184 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9qdg\" (UniqueName: \"kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg\") pod \"nova-cell0-cell-mapping-h47kr\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.528581 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.575119 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.576387 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.624474 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.642353 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.645255 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.674723 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.702290 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.702424 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.702457 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrx9\" (UniqueName: \"kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.734208 5118 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.799553 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.803932 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804002 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctdm\" (UniqueName: \"kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804039 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804108 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804133 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs\") pod 
\"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804162 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrx9\" (UniqueName: \"kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.804220 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.815802 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.863642 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.864185 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.880318 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.892649 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.923378 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrx9\" (UniqueName: \"kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946655 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946754 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946790 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctdm\" (UniqueName: \"kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946837 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs\") pod \"nova-metadata-0\" (UID: 
\"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946865 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946899 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946942 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.946963 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72bl\" (UniqueName: \"kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.952086 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.955653 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:09 crc kubenswrapper[5118]: I0223 07:08:09.982621 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.001638 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.004704 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.020089 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctdm\" (UniqueName: \"kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm\") pod \"nova-api-0\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " pod="openstack/nova-api-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.024733 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.054231 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.054339 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.054394 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.054411 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72bl\" (UniqueName: \"kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.055511 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.058716 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.062759 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.080677 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72bl\" (UniqueName: \"kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl\") pod \"nova-metadata-0\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.116737 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.118575 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.124060 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.147787 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.158929 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhcb\" (UniqueName: \"kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.159074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.159184 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.173284 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.176503 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.187712 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.212918 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.261886 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.262181 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.262778 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.263238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " 
pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.275205 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74xns\" (UniqueName: \"kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.275300 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.275332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mhcb\" (UniqueName: \"kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.275357 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.275408 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc 
kubenswrapper[5118]: I0223 07:08:10.280179 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.281316 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.294949 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mhcb\" (UniqueName: \"kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb\") pod \"nova-scheduler-0\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377612 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377715 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377753 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377787 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74xns\" (UniqueName: \"kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377823 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.377888 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.379688 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.379990 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.381244 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.381307 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.381461 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.405680 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74xns\" (UniqueName: \"kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns\") pod \"dnsmasq-dns-849fff7679-bb58m\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.410494 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h47kr"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.458832 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.505885 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.560495 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wps6h"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.566139 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.569165 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.569622 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.579841 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wps6h"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.588255 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.686378 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.686485 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data\") 
pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.686556 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.686609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g87k\" (UniqueName: \"kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.707270 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:10 crc kubenswrapper[5118]: W0223 07:08:10.720494 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2764489_48a8_4e1d_88e6_2bbd0c9e3585.slice/crio-8534901dcdd6f644a098cc5e6d9c59313283e91668c3c633fcf1f5c724faf7f0 WatchSource:0}: Error finding container 8534901dcdd6f644a098cc5e6d9c59313283e91668c3c633fcf1f5c724faf7f0: Status 404 returned error can't find the container with id 8534901dcdd6f644a098cc5e6d9c59313283e91668c3c633fcf1f5c724faf7f0 Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.791771 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: 
\"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.792356 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.792509 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.792549 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g87k\" (UniqueName: \"kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.800519 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.800531 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: 
\"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.800852 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.814435 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g87k\" (UniqueName: \"kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k\") pod \"nova-cell1-conductor-db-sync-wps6h\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:10 crc kubenswrapper[5118]: I0223 07:08:10.875553 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.017242 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.075892 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.109445 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:08:11 crc kubenswrapper[5118]: W0223 07:08:11.120231 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b80ddbb_91d7_496a_bd7c_d0798dad05f6.slice/crio-e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0 WatchSource:0}: Error finding container e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0: Status 404 returned error can't find the container with id e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0 Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.136942 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-bb58m" event={"ID":"8a38ce22-4dfe-44b7-bd53-df058bf38afe","Type":"ContainerStarted","Data":"0e2590d9ecf33e189bf395fa71206cd698caaf3963536229dcc415e5e911ab7c"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.138313 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b","Type":"ContainerStarted","Data":"9daaec3fd27e9f2a4caa2f2fe363aef604dcede92bf585d98eab1ca7e32649d0"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.148496 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h47kr" event={"ID":"d081a48a-959d-43fe-92de-180551979ba7","Type":"ContainerStarted","Data":"b1c8d946aea11cb7af4ee8d18d5fddb1fc1d4ea9530beb309c55bad6ab1ac93a"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.148845 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-h47kr" event={"ID":"d081a48a-959d-43fe-92de-180551979ba7","Type":"ContainerStarted","Data":"dd08c0bdc6e0f716267a91dfa8f68587e9d0d2412c42c6d4dd3a762b69c179d9"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.161962 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerStarted","Data":"bd3c26a6f00f7f4816b3f45550043c05f576a7b30e7bd0ac80c70d09c7d65ecf"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.177623 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerStarted","Data":"8534901dcdd6f644a098cc5e6d9c59313283e91668c3c633fcf1f5c724faf7f0"} Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.180800 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h47kr" podStartSLOduration=2.180742975 podStartE2EDuration="2.180742975s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:11.176389862 +0000 UTC m=+1354.180174485" watchObservedRunningTime="2026-02-23 07:08:11.180742975 +0000 UTC m=+1354.184527588" Feb 23 07:08:11 crc kubenswrapper[5118]: I0223 07:08:11.533344 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wps6h"] Feb 23 07:08:11 crc kubenswrapper[5118]: E0223 07:08:11.901250 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a38ce22_4dfe_44b7_bd53_df058bf38afe.slice/crio-conmon-6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a38ce22_4dfe_44b7_bd53_df058bf38afe.slice/crio-6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.215632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wps6h" event={"ID":"74e0be85-acce-4ee5-a56a-22f60082695d","Type":"ContainerStarted","Data":"dfbcad5efe1813a218d7063b164a48376977a850a9936dc6b7bc7563d47686e9"} Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.215718 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wps6h" event={"ID":"74e0be85-acce-4ee5-a56a-22f60082695d","Type":"ContainerStarted","Data":"1abd6b02d74821500d1c22256f8fc61ff0ce80b540004522d0328fb2d2ccef0a"} Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.223179 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b80ddbb-91d7-496a-bd7c-d0798dad05f6","Type":"ContainerStarted","Data":"e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0"} Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.237036 5118 generic.go:334] "Generic (PLEG): container finished" podID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerID="6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a" exitCode=0 Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.238584 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-bb58m" event={"ID":"8a38ce22-4dfe-44b7-bd53-df058bf38afe","Type":"ContainerDied","Data":"6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a"} Feb 23 07:08:12 crc kubenswrapper[5118]: I0223 07:08:12.238884 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wps6h" podStartSLOduration=2.238865835 
podStartE2EDuration="2.238865835s" podCreationTimestamp="2026-02-23 07:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:12.232380232 +0000 UTC m=+1355.236164805" watchObservedRunningTime="2026-02-23 07:08:12.238865835 +0000 UTC m=+1355.242650408" Feb 23 07:08:14 crc kubenswrapper[5118]: I0223 07:08:14.000905 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:14 crc kubenswrapper[5118]: I0223 07:08:14.010636 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.275392 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerStarted","Data":"0adb6c329a2cd02154518bf66f2b846b623f8eee18887b789bac01be55ba8b0b"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.276476 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerStarted","Data":"d4558ce9d6310e8db6eb2cdf9488a47360b540290b0f1dfc0509bc3c7c548193"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.275591 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-metadata" containerID="cri-o://0adb6c329a2cd02154518bf66f2b846b623f8eee18887b789bac01be55ba8b0b" gracePeriod=30 Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.275516 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-log" containerID="cri-o://d4558ce9d6310e8db6eb2cdf9488a47360b540290b0f1dfc0509bc3c7c548193" gracePeriod=30 Feb 23 07:08:15 crc 
kubenswrapper[5118]: I0223 07:08:15.283769 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerStarted","Data":"29ab15e771c8d2bfde6f33b4f6d051919adf09dce39953172b199ed5aca3c723"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.283830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerStarted","Data":"85debac3ccc2911a804ae95237cba4b8a027c9e5be200e4cd588484f443d6ba6"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.286051 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b80ddbb-91d7-496a-bd7c-d0798dad05f6","Type":"ContainerStarted","Data":"b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.290118 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-bb58m" event={"ID":"8a38ce22-4dfe-44b7-bd53-df058bf38afe","Type":"ContainerStarted","Data":"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.290316 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.292264 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b","Type":"ContainerStarted","Data":"85d21d0dac4a95b6e222c0bbe6b332c072d7e302a0738ab596e737641693a839"} Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.292384 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://85d21d0dac4a95b6e222c0bbe6b332c072d7e302a0738ab596e737641693a839" gracePeriod=30 Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.310004 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.591037954 podStartE2EDuration="6.309981841s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="2026-02-23 07:08:10.87690328 +0000 UTC m=+1353.880687853" lastFinishedPulling="2026-02-23 07:08:14.595847147 +0000 UTC m=+1357.599631740" observedRunningTime="2026-02-23 07:08:15.303451576 +0000 UTC m=+1358.307236149" watchObservedRunningTime="2026-02-23 07:08:15.309981841 +0000 UTC m=+1358.313766414" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.332654 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.378840312 podStartE2EDuration="6.332630638s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="2026-02-23 07:08:10.64289537 +0000 UTC m=+1353.646679943" lastFinishedPulling="2026-02-23 07:08:14.596685696 +0000 UTC m=+1357.600470269" observedRunningTime="2026-02-23 07:08:15.320192813 +0000 UTC m=+1358.323977386" watchObservedRunningTime="2026-02-23 07:08:15.332630638 +0000 UTC m=+1358.336415211" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.351982 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.883241793 podStartE2EDuration="6.351959467s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="2026-02-23 07:08:11.127221505 +0000 UTC m=+1354.131006098" lastFinishedPulling="2026-02-23 07:08:14.595939179 +0000 UTC m=+1357.599723772" observedRunningTime="2026-02-23 07:08:15.342228556 +0000 UTC m=+1358.346013139" watchObservedRunningTime="2026-02-23 07:08:15.351959467 +0000 UTC m=+1358.355744040" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 
07:08:15.377265 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849fff7679-bb58m" podStartSLOduration=6.377241286 podStartE2EDuration="6.377241286s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:15.36473834 +0000 UTC m=+1358.368522913" watchObservedRunningTime="2026-02-23 07:08:15.377241286 +0000 UTC m=+1358.381025859" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.413790 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.540436654 podStartE2EDuration="6.413765292s" podCreationTimestamp="2026-02-23 07:08:09 +0000 UTC" firstStartedPulling="2026-02-23 07:08:10.723080672 +0000 UTC m=+1353.726865255" lastFinishedPulling="2026-02-23 07:08:14.59640932 +0000 UTC m=+1357.600193893" observedRunningTime="2026-02-23 07:08:15.400313363 +0000 UTC m=+1358.404097936" watchObservedRunningTime="2026-02-23 07:08:15.413765292 +0000 UTC m=+1358.417549865" Feb 23 07:08:15 crc kubenswrapper[5118]: I0223 07:08:15.460242 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:08:16 crc kubenswrapper[5118]: I0223 07:08:16.302731 5118 generic.go:334] "Generic (PLEG): container finished" podID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerID="d4558ce9d6310e8db6eb2cdf9488a47360b540290b0f1dfc0509bc3c7c548193" exitCode=143 Feb 23 07:08:16 crc kubenswrapper[5118]: I0223 07:08:16.302837 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerDied","Data":"d4558ce9d6310e8db6eb2cdf9488a47360b540290b0f1dfc0509bc3c7c548193"} Feb 23 07:08:19 crc kubenswrapper[5118]: I0223 07:08:19.271417 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Feb 23 07:08:19 crc kubenswrapper[5118]: I0223 07:08:19.382557 5118 generic.go:334] "Generic (PLEG): container finished" podID="d081a48a-959d-43fe-92de-180551979ba7" containerID="b1c8d946aea11cb7af4ee8d18d5fddb1fc1d4ea9530beb309c55bad6ab1ac93a" exitCode=0 Feb 23 07:08:19 crc kubenswrapper[5118]: I0223 07:08:19.382614 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h47kr" event={"ID":"d081a48a-959d-43fe-92de-180551979ba7","Type":"ContainerDied","Data":"b1c8d946aea11cb7af4ee8d18d5fddb1fc1d4ea9530beb309c55bad6ab1ac93a"} Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.006225 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.026642 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.026747 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.213027 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.213116 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.393305 5118 generic.go:334] "Generic (PLEG): container finished" podID="74e0be85-acce-4ee5-a56a-22f60082695d" containerID="dfbcad5efe1813a218d7063b164a48376977a850a9936dc6b7bc7563d47686e9" exitCode=0 Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.393389 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wps6h" 
event={"ID":"74e0be85-acce-4ee5-a56a-22f60082695d","Type":"ContainerDied","Data":"dfbcad5efe1813a218d7063b164a48376977a850a9936dc6b7bc7563d47686e9"} Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.459782 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.505416 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.507307 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.610641 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.610889 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="dnsmasq-dns" containerID="cri-o://88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f" gracePeriod=10 Feb 23 07:08:20 crc kubenswrapper[5118]: I0223 07:08:20.894758 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.020015 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle\") pod \"d081a48a-959d-43fe-92de-180551979ba7\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.020918 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9qdg\" (UniqueName: \"kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg\") pod \"d081a48a-959d-43fe-92de-180551979ba7\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.021046 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data\") pod \"d081a48a-959d-43fe-92de-180551979ba7\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.021198 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts\") pod \"d081a48a-959d-43fe-92de-180551979ba7\" (UID: \"d081a48a-959d-43fe-92de-180551979ba7\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.029531 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg" (OuterVolumeSpecName: "kube-api-access-m9qdg") pod "d081a48a-959d-43fe-92de-180551979ba7" (UID: "d081a48a-959d-43fe-92de-180551979ba7"). InnerVolumeSpecName "kube-api-access-m9qdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.030554 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts" (OuterVolumeSpecName: "scripts") pod "d081a48a-959d-43fe-92de-180551979ba7" (UID: "d081a48a-959d-43fe-92de-180551979ba7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.052993 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d081a48a-959d-43fe-92de-180551979ba7" (UID: "d081a48a-959d-43fe-92de-180551979ba7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.083292 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data" (OuterVolumeSpecName: "config-data") pod "d081a48a-959d-43fe-92de-180551979ba7" (UID: "d081a48a-959d-43fe-92de-180551979ba7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.109441 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.109456 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.125256 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9qdg\" (UniqueName: \"kubernetes.io/projected/d081a48a-959d-43fe-92de-180551979ba7-kube-api-access-m9qdg\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.125315 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.125332 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.125347 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d081a48a-959d-43fe-92de-180551979ba7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.169171 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.330799 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.330847 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.331079 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.331129 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.331188 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kx5l\" (UniqueName: \"kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.331236 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb\") pod \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\" (UID: \"d4609473-1aaa-4fb5-9971-6978cbe4ae63\") " Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.340758 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l" (OuterVolumeSpecName: "kube-api-access-5kx5l") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "kube-api-access-5kx5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.383875 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.393412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.396026 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config" (OuterVolumeSpecName: "config") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.398118 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.411027 5118 generic.go:334] "Generic (PLEG): container finished" podID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerID="88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f" exitCode=0 Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.411338 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" event={"ID":"d4609473-1aaa-4fb5-9971-6978cbe4ae63","Type":"ContainerDied","Data":"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f"} Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.411435 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" event={"ID":"d4609473-1aaa-4fb5-9971-6978cbe4ae63","Type":"ContainerDied","Data":"28f9356e0a8318fed6f4ef5d1c89f8ef39c38b72cc27ee5509f8aeb3d2eeaa65"} Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.411512 5118 scope.go:117] "RemoveContainer" containerID="88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.411695 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c69c79c7f-vdh92" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.417011 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h47kr" event={"ID":"d081a48a-959d-43fe-92de-180551979ba7","Type":"ContainerDied","Data":"dd08c0bdc6e0f716267a91dfa8f68587e9d0d2412c42c6d4dd3a762b69c179d9"} Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.417082 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd08c0bdc6e0f716267a91dfa8f68587e9d0d2412c42c6d4dd3a762b69c179d9" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.417203 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h47kr" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.417313 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4609473-1aaa-4fb5-9971-6978cbe4ae63" (UID: "d4609473-1aaa-4fb5-9971-6978cbe4ae63"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.433154 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.443668 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.443714 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kx5l\" (UniqueName: \"kubernetes.io/projected/d4609473-1aaa-4fb5-9971-6978cbe4ae63-kube-api-access-5kx5l\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.443729 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.443766 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.443781 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4609473-1aaa-4fb5-9971-6978cbe4ae63-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.469651 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.486193 5118 scope.go:117] "RemoveContainer" 
containerID="d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.528365 5118 scope.go:117] "RemoveContainer" containerID="88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f" Feb 23 07:08:21 crc kubenswrapper[5118]: E0223 07:08:21.528878 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f\": container with ID starting with 88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f not found: ID does not exist" containerID="88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.528911 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f"} err="failed to get container status \"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f\": rpc error: code = NotFound desc = could not find container \"88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f\": container with ID starting with 88fa691b5f611ece6d9514f4ceb2bdeca25a27bfc793ab84e97bf459300e7f3f not found: ID does not exist" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.528933 5118 scope.go:117] "RemoveContainer" containerID="d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f" Feb 23 07:08:21 crc kubenswrapper[5118]: E0223 07:08:21.529162 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f\": container with ID starting with d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f not found: ID does not exist" containerID="d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f" Feb 23 07:08:21 crc 
kubenswrapper[5118]: I0223 07:08:21.529183 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f"} err="failed to get container status \"d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f\": rpc error: code = NotFound desc = could not find container \"d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f\": container with ID starting with d4bf2e72a21bef123f7aeb4aa44d3b8ed69fc6bd5ee0d48e149e0b5bcb9e140f not found: ID does not exist" Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.676816 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.677592 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-api" containerID="cri-o://29ab15e771c8d2bfde6f33b4f6d051919adf09dce39953172b199ed5aca3c723" gracePeriod=30 Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.677943 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-log" containerID="cri-o://85debac3ccc2911a804ae95237cba4b8a027c9e5be200e4cd588484f443d6ba6" gracePeriod=30 Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.829802 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.841866 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c69c79c7f-vdh92"] Feb 23 07:08:21 crc kubenswrapper[5118]: I0223 07:08:21.900315 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.057852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data\") pod \"74e0be85-acce-4ee5-a56a-22f60082695d\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.057951 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts\") pod \"74e0be85-acce-4ee5-a56a-22f60082695d\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.058022 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle\") pod \"74e0be85-acce-4ee5-a56a-22f60082695d\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.058089 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g87k\" (UniqueName: \"kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k\") pod \"74e0be85-acce-4ee5-a56a-22f60082695d\" (UID: \"74e0be85-acce-4ee5-a56a-22f60082695d\") " Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.066154 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k" (OuterVolumeSpecName: "kube-api-access-7g87k") pod "74e0be85-acce-4ee5-a56a-22f60082695d" (UID: "74e0be85-acce-4ee5-a56a-22f60082695d"). InnerVolumeSpecName "kube-api-access-7g87k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.098532 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts" (OuterVolumeSpecName: "scripts") pod "74e0be85-acce-4ee5-a56a-22f60082695d" (UID: "74e0be85-acce-4ee5-a56a-22f60082695d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.111857 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.131259 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e0be85-acce-4ee5-a56a-22f60082695d" (UID: "74e0be85-acce-4ee5-a56a-22f60082695d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.137248 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data" (OuterVolumeSpecName: "config-data") pod "74e0be85-acce-4ee5-a56a-22f60082695d" (UID: "74e0be85-acce-4ee5-a56a-22f60082695d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.159965 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.159996 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.160008 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g87k\" (UniqueName: \"kubernetes.io/projected/74e0be85-acce-4ee5-a56a-22f60082695d-kube-api-access-7g87k\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.160018 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e0be85-acce-4ee5-a56a-22f60082695d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.427001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wps6h" event={"ID":"74e0be85-acce-4ee5-a56a-22f60082695d","Type":"ContainerDied","Data":"1abd6b02d74821500d1c22256f8fc61ff0ce80b540004522d0328fb2d2ccef0a"} Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.427309 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1abd6b02d74821500d1c22256f8fc61ff0ce80b540004522d0328fb2d2ccef0a" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.427191 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wps6h" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.431019 5118 generic.go:334] "Generic (PLEG): container finished" podID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerID="85debac3ccc2911a804ae95237cba4b8a027c9e5be200e4cd588484f443d6ba6" exitCode=143 Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.431087 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerDied","Data":"85debac3ccc2911a804ae95237cba4b8a027c9e5be200e4cd588484f443d6ba6"} Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.540688 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:08:22 crc kubenswrapper[5118]: E0223 07:08:22.541355 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="init" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541377 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="init" Feb 23 07:08:22 crc kubenswrapper[5118]: E0223 07:08:22.541386 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d081a48a-959d-43fe-92de-180551979ba7" containerName="nova-manage" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541396 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d081a48a-959d-43fe-92de-180551979ba7" containerName="nova-manage" Feb 23 07:08:22 crc kubenswrapper[5118]: E0223 07:08:22.541410 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e0be85-acce-4ee5-a56a-22f60082695d" containerName="nova-cell1-conductor-db-sync" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541417 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e0be85-acce-4ee5-a56a-22f60082695d" containerName="nova-cell1-conductor-db-sync" Feb 23 07:08:22 crc 
kubenswrapper[5118]: E0223 07:08:22.541452 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="dnsmasq-dns" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541460 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="dnsmasq-dns" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541624 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" containerName="dnsmasq-dns" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541647 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e0be85-acce-4ee5-a56a-22f60082695d" containerName="nova-cell1-conductor-db-sync" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.541660 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d081a48a-959d-43fe-92de-180551979ba7" containerName="nova-manage" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.542320 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.545569 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.567830 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.668850 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.668907 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.669507 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmd9j\" (UniqueName: \"kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.771751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmd9j\" (UniqueName: \"kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 
07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.771856 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.771884 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.779882 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.795573 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.799032 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmd9j\" (UniqueName: \"kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j\") pod \"nova-cell1-conductor-0\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") " pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:22 crc kubenswrapper[5118]: I0223 07:08:22.874453 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:23 crc kubenswrapper[5118]: I0223 07:08:23.431194 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:08:23 crc kubenswrapper[5118]: I0223 07:08:23.454763 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerName="nova-scheduler-scheduler" containerID="cri-o://b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a" gracePeriod=30 Feb 23 07:08:23 crc kubenswrapper[5118]: I0223 07:08:23.455238 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"460a8b7a-b61f-4f56-889e-54b5c2346679","Type":"ContainerStarted","Data":"5fe9b8cda05c8c35fcafac2a173468abac5eccb15e3f49c0e175f828ab25acc4"} Feb 23 07:08:23 crc kubenswrapper[5118]: I0223 07:08:23.708728 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4609473-1aaa-4fb5-9971-6978cbe4ae63" path="/var/lib/kubelet/pods/d4609473-1aaa-4fb5-9971-6978cbe4ae63/volumes" Feb 23 07:08:24 crc kubenswrapper[5118]: I0223 07:08:24.467655 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"460a8b7a-b61f-4f56-889e-54b5c2346679","Type":"ContainerStarted","Data":"896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70"} Feb 23 07:08:24 crc kubenswrapper[5118]: I0223 07:08:24.468370 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 07:08:24 crc kubenswrapper[5118]: I0223 07:08:24.492400 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.492370555 podStartE2EDuration="2.492370555s" podCreationTimestamp="2026-02-23 07:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:24.486513033 +0000 UTC m=+1367.490297606" watchObservedRunningTime="2026-02-23 07:08:24.492370555 +0000 UTC m=+1367.496155128" Feb 23 07:08:25 crc kubenswrapper[5118]: E0223 07:08:25.466359 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:08:25 crc kubenswrapper[5118]: E0223 07:08:25.471944 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:08:25 crc kubenswrapper[5118]: E0223 07:08:25.479814 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 07:08:25 crc kubenswrapper[5118]: E0223 07:08:25.480767 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerName="nova-scheduler-scheduler" Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.045757 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.046032 5118 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6249b699-2754-43b0-ae07-e726d80c5233" containerName="kube-state-metrics" containerID="cri-o://f1fb890b79737bfbf99ce5ecc106b736e0487142f1609990b3b71bfce17058cd" gracePeriod=30 Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.506943 5118 generic.go:334] "Generic (PLEG): container finished" podID="6249b699-2754-43b0-ae07-e726d80c5233" containerID="f1fb890b79737bfbf99ce5ecc106b736e0487142f1609990b3b71bfce17058cd" exitCode=2 Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.507175 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6249b699-2754-43b0-ae07-e726d80c5233","Type":"ContainerDied","Data":"f1fb890b79737bfbf99ce5ecc106b736e0487142f1609990b3b71bfce17058cd"} Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.507527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6249b699-2754-43b0-ae07-e726d80c5233","Type":"ContainerDied","Data":"5a019e560972794adcea298997061d3ea7c2e95996d214e59f3ad5f5853f8f88"} Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.507546 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a019e560972794adcea298997061d3ea7c2e95996d214e59f3ad5f5853f8f88" Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.563864 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.708835 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7t5\" (UniqueName: \"kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5\") pod \"6249b699-2754-43b0-ae07-e726d80c5233\" (UID: \"6249b699-2754-43b0-ae07-e726d80c5233\") " Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.750225 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5" (OuterVolumeSpecName: "kube-api-access-pn7t5") pod "6249b699-2754-43b0-ae07-e726d80c5233" (UID: "6249b699-2754-43b0-ae07-e726d80c5233"). InnerVolumeSpecName "kube-api-access-pn7t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:26 crc kubenswrapper[5118]: I0223 07:08:26.812736 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn7t5\" (UniqueName: \"kubernetes.io/projected/6249b699-2754-43b0-ae07-e726d80c5233-kube-api-access-pn7t5\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.520866 5118 generic.go:334] "Generic (PLEG): container finished" podID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerID="29ab15e771c8d2bfde6f33b4f6d051919adf09dce39953172b199ed5aca3c723" exitCode=0 Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.520985 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerDied","Data":"29ab15e771c8d2bfde6f33b4f6d051919adf09dce39953172b199ed5aca3c723"} Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.523759 5118 generic.go:334] "Generic (PLEG): container finished" podID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerID="b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a" exitCode=0 Feb 23 07:08:27 
crc kubenswrapper[5118]: I0223 07:08:27.523851 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.523871 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b80ddbb-91d7-496a-bd7c-d0798dad05f6","Type":"ContainerDied","Data":"b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a"} Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.523946 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5b80ddbb-91d7-496a-bd7c-d0798dad05f6","Type":"ContainerDied","Data":"e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0"} Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.525449 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e93e9969e2d3b1f54823eb80d373a785d8fca82a53315883881a40c6a8f1edd0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.560338 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.638930 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.679599 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.691033 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.713135 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6249b699-2754-43b0-ae07-e726d80c5233" path="/var/lib/kubelet/pods/6249b699-2754-43b0-ae07-e726d80c5233/volumes" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.720551 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:08:27 crc kubenswrapper[5118]: E0223 07:08:27.721332 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6249b699-2754-43b0-ae07-e726d80c5233" containerName="kube-state-metrics" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721353 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6249b699-2754-43b0-ae07-e726d80c5233" containerName="kube-state-metrics" Feb 23 07:08:27 crc kubenswrapper[5118]: E0223 07:08:27.721365 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-api" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721374 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-api" Feb 23 07:08:27 crc kubenswrapper[5118]: E0223 07:08:27.721388 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerName="nova-scheduler-scheduler" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721395 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerName="nova-scheduler-scheduler" Feb 23 07:08:27 crc 
kubenswrapper[5118]: E0223 07:08:27.721425 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-log" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721431 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-log" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721630 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-api" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721646 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6249b699-2754-43b0-ae07-e726d80c5233" containerName="kube-state-metrics" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721661 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" containerName="nova-api-log" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.721675 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" containerName="nova-scheduler-scheduler" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.722507 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.725259 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.725501 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.732622 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.752046 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle\") pod \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.752360 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle\") pod \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.752489 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data\") pod \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.752695 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data\") pod \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " Feb 23 07:08:27 crc 
kubenswrapper[5118]: I0223 07:08:27.752838 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctdm\" (UniqueName: \"kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm\") pod \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.752944 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs\") pod \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\" (UID: \"d2764489-48a8-4e1d-88e6-2bbd0c9e3585\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.753116 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mhcb\" (UniqueName: \"kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb\") pod \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\" (UID: \"5b80ddbb-91d7-496a-bd7c-d0798dad05f6\") " Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.753971 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs" (OuterVolumeSpecName: "logs") pod "d2764489-48a8-4e1d-88e6-2bbd0c9e3585" (UID: "d2764489-48a8-4e1d-88e6-2bbd0c9e3585"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.759311 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb" (OuterVolumeSpecName: "kube-api-access-8mhcb") pod "5b80ddbb-91d7-496a-bd7c-d0798dad05f6" (UID: "5b80ddbb-91d7-496a-bd7c-d0798dad05f6"). InnerVolumeSpecName "kube-api-access-8mhcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.759380 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm" (OuterVolumeSpecName: "kube-api-access-tctdm") pod "d2764489-48a8-4e1d-88e6-2bbd0c9e3585" (UID: "d2764489-48a8-4e1d-88e6-2bbd0c9e3585"). InnerVolumeSpecName "kube-api-access-tctdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.780522 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data" (OuterVolumeSpecName: "config-data") pod "5b80ddbb-91d7-496a-bd7c-d0798dad05f6" (UID: "5b80ddbb-91d7-496a-bd7c-d0798dad05f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.790486 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b80ddbb-91d7-496a-bd7c-d0798dad05f6" (UID: "5b80ddbb-91d7-496a-bd7c-d0798dad05f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.791275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data" (OuterVolumeSpecName: "config-data") pod "d2764489-48a8-4e1d-88e6-2bbd0c9e3585" (UID: "d2764489-48a8-4e1d-88e6-2bbd0c9e3585"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.800498 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2764489-48a8-4e1d-88e6-2bbd0c9e3585" (UID: "d2764489-48a8-4e1d-88e6-2bbd0c9e3585"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.855788 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.855868 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.855907 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4492q\" (UniqueName: \"kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0" Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.855936 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856419 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856531 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856588 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856608 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856657 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tctdm\" (UniqueName: \"kubernetes.io/projected/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-kube-api-access-tctdm\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856682 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2764489-48a8-4e1d-88e6-2bbd0c9e3585-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.856701 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mhcb\" (UniqueName: \"kubernetes.io/projected/5b80ddbb-91d7-496a-bd7c-d0798dad05f6-kube-api-access-8mhcb\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.914316 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.914665 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-central-agent" containerID="cri-o://5b1673251e24d33e14cb885d1a30100b57c0b80b0883e50be4e26c0544e5eeaa" gracePeriod=30
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.914723 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="proxy-httpd" containerID="cri-o://a065dfe336ce5b3881868f56481691176a9b31d3f9dbcb65b5a810d6d7860b51" gracePeriod=30
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.914780 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-notification-agent" containerID="cri-o://63d318b94849d84942e140978eb3d696a0f38b314c7ae1a84ee03f020ac59443" gracePeriod=30
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.914745 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="sg-core" containerID="cri-o://85649c8553e19892cd3a25464dcbc4e9c7dbd2ca6f717c585fde0037df3d2ef1" gracePeriod=30
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.959738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.959827 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.959892 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4492q\" (UniqueName: \"kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.959914 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.964409 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.965006 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.966540 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:27 crc kubenswrapper[5118]: I0223 07:08:27.982565 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4492q\" (UniqueName: \"kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q\") pod \"kube-state-metrics-0\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " pod="openstack/kube-state-metrics-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.042632 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538189 5118 generic.go:334] "Generic (PLEG): container finished" podID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerID="a065dfe336ce5b3881868f56481691176a9b31d3f9dbcb65b5a810d6d7860b51" exitCode=0
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538692 5118 generic.go:334] "Generic (PLEG): container finished" podID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerID="85649c8553e19892cd3a25464dcbc4e9c7dbd2ca6f717c585fde0037df3d2ef1" exitCode=2
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538720 5118 generic.go:334] "Generic (PLEG): container finished" podID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerID="5b1673251e24d33e14cb885d1a30100b57c0b80b0883e50be4e26c0544e5eeaa" exitCode=0
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538249 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerDied","Data":"a065dfe336ce5b3881868f56481691176a9b31d3f9dbcb65b5a810d6d7860b51"}
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538807 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerDied","Data":"85649c8553e19892cd3a25464dcbc4e9c7dbd2ca6f717c585fde0037df3d2ef1"}
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.538823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerDied","Data":"5b1673251e24d33e14cb885d1a30100b57c0b80b0883e50be4e26c0544e5eeaa"}
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.543534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2764489-48a8-4e1d-88e6-2bbd0c9e3585","Type":"ContainerDied","Data":"8534901dcdd6f644a098cc5e6d9c59313283e91668c3c633fcf1f5c724faf7f0"}
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.543568 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.543580 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.543595 5118 scope.go:117] "RemoveContainer" containerID="29ab15e771c8d2bfde6f33b4f6d051919adf09dce39953172b199ed5aca3c723"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.578449 5118 scope.go:117] "RemoveContainer" containerID="85debac3ccc2911a804ae95237cba4b8a027c9e5be200e4cd588484f443d6ba6"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.593348 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.616314 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.627240 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.637796 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.639239 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.643826 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.658941 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.663417 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.672327 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.683516 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.687553 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.691728 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.692691 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.788131 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5jhq\" (UniqueName: \"kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.788629 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.788827 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.892861 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.892936 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.893006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.893056 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.893173 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5jhq\" (UniqueName: \"kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.893203 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.893232 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh89n\" (UniqueName: \"kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.913057 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.913553 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.921046 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5jhq\" (UniqueName: \"kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq\") pod \"nova-scheduler-0\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") " pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.979936 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.996074 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh89n\" (UniqueName: \"kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.996253 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.996294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.996354 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:28 crc kubenswrapper[5118]: I0223 07:08:28.996840 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.000444 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.004890 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.020817 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh89n\" (UniqueName: \"kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n\") pod \"nova-api-0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") " pod="openstack/nova-api-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.311848 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.550546 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.564725 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed","Type":"ContainerStarted","Data":"f9fbd71a03f2a93ba309cdce906910ae6003771c797503a1153318d351d6f24a"}
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.564792 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed","Type":"ContainerStarted","Data":"dbd9f4de7ae688bd647f75ec9ac708ff74b177712c8b7acc2f8cbd82f2a1bdec"}
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.565351 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.608443 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.153545178 podStartE2EDuration="2.608418873s" podCreationTimestamp="2026-02-23 07:08:27 +0000 UTC" firstStartedPulling="2026-02-23 07:08:28.594360242 +0000 UTC m=+1371.598144815" lastFinishedPulling="2026-02-23 07:08:29.049233927 +0000 UTC m=+1372.053018510" observedRunningTime="2026-02-23 07:08:29.594792042 +0000 UTC m=+1372.598576615" watchObservedRunningTime="2026-02-23 07:08:29.608418873 +0000 UTC m=+1372.612203446"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.711469 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b80ddbb-91d7-496a-bd7c-d0798dad05f6" path="/var/lib/kubelet/pods/5b80ddbb-91d7-496a-bd7c-d0798dad05f6/volumes"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.712170 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2764489-48a8-4e1d-88e6-2bbd0c9e3585" path="/var/lib/kubelet/pods/d2764489-48a8-4e1d-88e6-2bbd0c9e3585/volumes"
Feb 23 07:08:29 crc kubenswrapper[5118]: I0223 07:08:29.798493 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:29 crc kubenswrapper[5118]: W0223 07:08:29.805202 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6512a05_f676_4dd2_a35b_80263ce769e0.slice/crio-f2f422196d269524611ca79c1f18d1771baaf95b950a6297d7575c232ebf19c0 WatchSource:0}: Error finding container f2f422196d269524611ca79c1f18d1771baaf95b950a6297d7575c232ebf19c0: Status 404 returned error can't find the container with id f2f422196d269524611ca79c1f18d1771baaf95b950a6297d7575c232ebf19c0
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.591347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"703b9028-5e0b-426c-bfe4-7751f8a68276","Type":"ContainerStarted","Data":"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c"}
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.591399 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"703b9028-5e0b-426c-bfe4-7751f8a68276","Type":"ContainerStarted","Data":"fd36ed7d3b6f739815a0cd87960249edab1b3c001bdbc2aaf4f1ba833fcf1628"}
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.593757 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerStarted","Data":"1414c73ef92874acbf185d09448ffefaeea1cdeb57051ac9f893863926df4d56"}
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.593814 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerStarted","Data":"59fa67691a10dd3e3fcad8a170edf9e3b186caa3f1705ebdfab9b0fbd839fcc2"}
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.593831 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerStarted","Data":"f2f422196d269524611ca79c1f18d1771baaf95b950a6297d7575c232ebf19c0"}
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.633521 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.633493422 podStartE2EDuration="2.633493422s" podCreationTimestamp="2026-02-23 07:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:30.611494368 +0000 UTC m=+1373.615278981" watchObservedRunningTime="2026-02-23 07:08:30.633493422 +0000 UTC m=+1373.637278005"
Feb 23 07:08:30 crc kubenswrapper[5118]: I0223 07:08:30.634691 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.634684441 podStartE2EDuration="2.634684441s" podCreationTimestamp="2026-02-23 07:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:30.631641337 +0000 UTC m=+1373.635425930" watchObservedRunningTime="2026-02-23 07:08:30.634684441 +0000 UTC m=+1373.638469024"
Feb 23 07:08:32 crc kubenswrapper[5118]: I0223 07:08:32.935222 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.633452 5118 generic.go:334] "Generic (PLEG): container finished" podID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerID="63d318b94849d84942e140978eb3d696a0f38b314c7ae1a84ee03f020ac59443" exitCode=0
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.633621 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerDied","Data":"63d318b94849d84942e140978eb3d696a0f38b314c7ae1a84ee03f020ac59443"}
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.634068 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b92146b1-cfa3-40c6-9c91-cb268024bd23","Type":"ContainerDied","Data":"2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed"}
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.634082 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2915c3f84cbeca37904a31cee0bd591a30a84fd57cf88e068fe1164f20aaa5ed"
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.681872 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721466 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sqbt\" (UniqueName: \"kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721569 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721629 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721681 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721704 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721783 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.721908 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd\") pod \"b92146b1-cfa3-40c6-9c91-cb268024bd23\" (UID: \"b92146b1-cfa3-40c6-9c91-cb268024bd23\") "
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.727799 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.729141 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.734553 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt" (OuterVolumeSpecName: "kube-api-access-6sqbt") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "kube-api-access-6sqbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.752412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts" (OuterVolumeSpecName: "scripts") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.784013 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.825381 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sqbt\" (UniqueName: \"kubernetes.io/projected/b92146b1-cfa3-40c6-9c91-cb268024bd23-kube-api-access-6sqbt\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.825437 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.825450 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.825461 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.825472 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b92146b1-cfa3-40c6-9c91-cb268024bd23-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.837837 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.882527 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data" (OuterVolumeSpecName: "config-data") pod "b92146b1-cfa3-40c6-9c91-cb268024bd23" (UID: "b92146b1-cfa3-40c6-9c91-cb268024bd23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.927828 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.927868 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92146b1-cfa3-40c6-9c91-cb268024bd23-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:33 crc kubenswrapper[5118]: I0223 07:08:33.980851 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.644165 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.686212 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.700554 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.714715 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:34 crc kubenswrapper[5118]: E0223 07:08:34.715387 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="proxy-httpd"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715410 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="proxy-httpd"
Feb 23 07:08:34 crc kubenswrapper[5118]: E0223 07:08:34.715430 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-central-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715439 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-central-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: E0223 07:08:34.715454 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-notification-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715462 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-notification-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: E0223 07:08:34.715491 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="sg-core"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715498 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="sg-core"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715672 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-central-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715686 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="proxy-httpd"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715708 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="ceilometer-notification-agent"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.715718 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" containerName="sg-core"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.719545 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.723204 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.723441 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.723595 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.725586 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.748663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.748758 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.749701 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwvv\" (UniqueName: \"kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.750056 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.750193 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.750329 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.750484 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.750697 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0"
Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852264 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data\") pod \"ceilometer-0\" (UID:
\"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852352 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852398 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdwvv\" (UniqueName: \"kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852496 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852616 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852687 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.852777 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.853337 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.853436 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.857644 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.857676 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc 
kubenswrapper[5118]: I0223 07:08:34.858873 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.860076 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.860193 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:34 crc kubenswrapper[5118]: I0223 07:08:34.880578 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwvv\" (UniqueName: \"kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv\") pod \"ceilometer-0\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " pod="openstack/ceilometer-0" Feb 23 07:08:35 crc kubenswrapper[5118]: I0223 07:08:35.041646 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:35 crc kubenswrapper[5118]: I0223 07:08:35.597398 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:35 crc kubenswrapper[5118]: I0223 07:08:35.661799 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerStarted","Data":"bdd6a988be93b392be6cf671556430693b8fb514b98f8493580a163bef73c59c"} Feb 23 07:08:35 crc kubenswrapper[5118]: I0223 07:08:35.714852 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92146b1-cfa3-40c6-9c91-cb268024bd23" path="/var/lib/kubelet/pods/b92146b1-cfa3-40c6-9c91-cb268024bd23/volumes" Feb 23 07:08:36 crc kubenswrapper[5118]: I0223 07:08:36.704517 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerStarted","Data":"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"} Feb 23 07:08:37 crc kubenswrapper[5118]: I0223 07:08:37.756487 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerStarted","Data":"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"} Feb 23 07:08:38 crc kubenswrapper[5118]: I0223 07:08:38.075617 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 23 07:08:38 crc kubenswrapper[5118]: I0223 07:08:38.772778 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerStarted","Data":"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"} Feb 23 07:08:38 crc kubenswrapper[5118]: I0223 07:08:38.980837 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 
23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.030309 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.312978 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.313552 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.786625 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerStarted","Data":"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"} Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.786930 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.823590 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:08:39 crc kubenswrapper[5118]: I0223 07:08:39.825951 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.381450601 podStartE2EDuration="5.825939925s" podCreationTimestamp="2026-02-23 07:08:34 +0000 UTC" firstStartedPulling="2026-02-23 07:08:35.612857904 +0000 UTC m=+1378.616642477" lastFinishedPulling="2026-02-23 07:08:39.057347228 +0000 UTC m=+1382.061131801" observedRunningTime="2026-02-23 07:08:39.825854233 +0000 UTC m=+1382.829638796" watchObservedRunningTime="2026-02-23 07:08:39.825939925 +0000 UTC m=+1382.829724498" Feb 23 07:08:40 crc kubenswrapper[5118]: I0223 07:08:40.395346 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:08:40 crc kubenswrapper[5118]: I0223 07:08:40.395416 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.856670 5118 generic.go:334] "Generic (PLEG): container finished" podID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerID="0adb6c329a2cd02154518bf66f2b846b623f8eee18887b789bac01be55ba8b0b" exitCode=137 Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.856773 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerDied","Data":"0adb6c329a2cd02154518bf66f2b846b623f8eee18887b789bac01be55ba8b0b"} Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.857693 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0eedf734-c6e4-47ae-ba24-64c3a057ce8b","Type":"ContainerDied","Data":"bd3c26a6f00f7f4816b3f45550043c05f576a7b30e7bd0ac80c70d09c7d65ecf"} Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.857720 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3c26a6f00f7f4816b3f45550043c05f576a7b30e7bd0ac80c70d09c7d65ecf" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.860600 5118 generic.go:334] "Generic (PLEG): container finished" podID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" containerID="85d21d0dac4a95b6e222c0bbe6b332c072d7e302a0738ab596e737641693a839" exitCode=137 Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.860665 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b","Type":"ContainerDied","Data":"85d21d0dac4a95b6e222c0bbe6b332c072d7e302a0738ab596e737641693a839"} Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.860717 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b","Type":"ContainerDied","Data":"9daaec3fd27e9f2a4caa2f2fe363aef604dcede92bf585d98eab1ca7e32649d0"} Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.860738 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9daaec3fd27e9f2a4caa2f2fe363aef604dcede92bf585d98eab1ca7e32649d0" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.868222 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.873848 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.938846 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs\") pod \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.938943 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle\") pod \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.939030 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data\") pod 
\"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.939132 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle\") pod \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.939174 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzrx9\" (UniqueName: \"kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9\") pod \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.939357 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72bl\" (UniqueName: \"kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl\") pod \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\" (UID: \"0eedf734-c6e4-47ae-ba24-64c3a057ce8b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.939456 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data\") pod \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\" (UID: \"e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b\") " Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.940148 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs" (OuterVolumeSpecName: "logs") pod "0eedf734-c6e4-47ae-ba24-64c3a057ce8b" (UID: "0eedf734-c6e4-47ae-ba24-64c3a057ce8b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.946736 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl" (OuterVolumeSpecName: "kube-api-access-w72bl") pod "0eedf734-c6e4-47ae-ba24-64c3a057ce8b" (UID: "0eedf734-c6e4-47ae-ba24-64c3a057ce8b"). InnerVolumeSpecName "kube-api-access-w72bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.949396 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9" (OuterVolumeSpecName: "kube-api-access-fzrx9") pod "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" (UID: "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b"). InnerVolumeSpecName "kube-api-access-fzrx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.982932 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" (UID: "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.987962 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eedf734-c6e4-47ae-ba24-64c3a057ce8b" (UID: "0eedf734-c6e4-47ae-ba24-64c3a057ce8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:45 crc kubenswrapper[5118]: I0223 07:08:45.998267 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data" (OuterVolumeSpecName: "config-data") pod "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" (UID: "e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.004519 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data" (OuterVolumeSpecName: "config-data") pod "0eedf734-c6e4-47ae-ba24-64c3a057ce8b" (UID: "0eedf734-c6e4-47ae-ba24-64c3a057ce8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.041937 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.041974 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.041988 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.042000 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 
07:08:46.042011 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.042077 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzrx9\" (UniqueName: \"kubernetes.io/projected/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b-kube-api-access-fzrx9\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.042091 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72bl\" (UniqueName: \"kubernetes.io/projected/0eedf734-c6e4-47ae-ba24-64c3a057ce8b-kube-api-access-w72bl\") on node \"crc\" DevicePath \"\"" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.876216 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.876246 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.948364 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.965731 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.985826 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:46 crc kubenswrapper[5118]: I0223 07:08:46.996631 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.008473 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: E0223 07:08:47.009234 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009265 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:08:47 crc kubenswrapper[5118]: E0223 07:08:47.009305 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-log" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009318 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-log" Feb 23 07:08:47 crc kubenswrapper[5118]: E0223 07:08:47.009366 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-metadata" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009380 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-metadata" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009689 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009734 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-metadata" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.009764 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" containerName="nova-metadata-log" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.011604 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.014941 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.015276 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.017781 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.019280 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.028637 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.028742 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.029537 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.039173 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.046434 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.067997 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rpg\" (UniqueName: \"kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068082 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74dw\" (UniqueName: \"kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068262 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068304 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068426 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068480 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs\") pod \"nova-metadata-0\" (UID: 
\"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068528 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.068566 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170396 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170508 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rpg\" (UniqueName: \"kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170532 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 
07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170574 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74dw\" (UniqueName: \"kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170600 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170624 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170671 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170691 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170716 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.170743 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.177612 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.179151 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.179636 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.182892 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.185275 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.191431 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.198793 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.201792 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.204650 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rpg\" (UniqueName: \"kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg\") pod \"nova-metadata-0\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") " pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.208693 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74dw\" (UniqueName: \"kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw\") pod \"nova-cell1-novncproxy-0\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.343154 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.355432 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.717503 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eedf734-c6e4-47ae-ba24-64c3a057ce8b" path="/var/lib/kubelet/pods/0eedf734-c6e4-47ae-ba24-64c3a057ce8b/volumes" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.718441 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b" path="/var/lib/kubelet/pods/e7e96e10-a4f8-4b46-8b0c-d2a447d19a6b/volumes" Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.965560 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: W0223 07:08:47.973655 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf07d4ae_f7da_44f0_80c1_a8e7b935ba26.slice/crio-2f53954a5a26a859544ed3edf9f7f958daf5e21dc1c11ca0926a01d788925294 WatchSource:0}: Error finding container 2f53954a5a26a859544ed3edf9f7f958daf5e21dc1c11ca0926a01d788925294: Status 404 returned error can't find the container with id 2f53954a5a26a859544ed3edf9f7f958daf5e21dc1c11ca0926a01d788925294 Feb 23 07:08:47 crc kubenswrapper[5118]: I0223 07:08:47.976962 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:08:47 crc kubenswrapper[5118]: W0223 07:08:47.979652 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ba04e0_7d62_472f_ab31_c41f926c93e7.slice/crio-d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893 WatchSource:0}: Error finding container d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893: Status 404 returned error can't find the container with id d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893 Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.910338 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerStarted","Data":"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35"} Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.911188 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerStarted","Data":"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b"} Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.911256 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerStarted","Data":"2f53954a5a26a859544ed3edf9f7f958daf5e21dc1c11ca0926a01d788925294"} Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.912713 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"04ba04e0-7d62-472f-ab31-c41f926c93e7","Type":"ContainerStarted","Data":"0751ec02f589e9689680bdbc3c817812f2676d3d07252002303baef1e65fe57b"} Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.912771 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"04ba04e0-7d62-472f-ab31-c41f926c93e7","Type":"ContainerStarted","Data":"d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893"} Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.942921 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.942895547 podStartE2EDuration="2.942895547s" podCreationTimestamp="2026-02-23 07:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:48.933767335 +0000 UTC m=+1391.937551948" watchObservedRunningTime="2026-02-23 07:08:48.942895547 +0000 UTC m=+1391.946680140" Feb 23 07:08:48 crc kubenswrapper[5118]: I0223 07:08:48.986436 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.986407512 podStartE2EDuration="2.986407512s" podCreationTimestamp="2026-02-23 07:08:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:48.967700419 +0000 UTC m=+1391.971485032" watchObservedRunningTime="2026-02-23 07:08:48.986407512 +0000 UTC m=+1391.990192115" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.318542 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.319115 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.319429 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.323699 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.927525 5118 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:08:49 crc kubenswrapper[5118]: I0223 07:08:49.930563 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.166690 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"] Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.169720 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.183912 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"] Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.268906 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.269208 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.269303 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 
crc kubenswrapper[5118]: I0223 07:08:50.269383 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.269446 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.269712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckmn\" (UniqueName: \"kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.371905 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zckmn\" (UniqueName: \"kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.372006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " 
pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.372083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.372122 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.372154 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.372178 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.373116 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 
07:08:50.373254 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.373260 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.373465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.373763 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.406655 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zckmn\" (UniqueName: \"kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn\") pod \"dnsmasq-dns-58f6456c9f-tlrdl\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:50 crc kubenswrapper[5118]: I0223 07:08:50.497350 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:51 crc kubenswrapper[5118]: W0223 07:08:51.100310 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15482ef3_bf3a_4442_9b2e_89222e09d218.slice/crio-ef247005c027dd1fcb615daca58067a68edf7c681dbfb3756e7e42eb7e3e5240 WatchSource:0}: Error finding container ef247005c027dd1fcb615daca58067a68edf7c681dbfb3756e7e42eb7e3e5240: Status 404 returned error can't find the container with id ef247005c027dd1fcb615daca58067a68edf7c681dbfb3756e7e42eb7e3e5240 Feb 23 07:08:51 crc kubenswrapper[5118]: I0223 07:08:51.105617 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"] Feb 23 07:08:51 crc kubenswrapper[5118]: I0223 07:08:51.948911 5118 generic.go:334] "Generic (PLEG): container finished" podID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerID="136f6da0caaf2340fd15c1a32301bdddbb712bf41669c14f592aaf8d43d192b1" exitCode=0 Feb 23 07:08:51 crc kubenswrapper[5118]: I0223 07:08:51.950510 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" event={"ID":"15482ef3-bf3a-4442-9b2e-89222e09d218","Type":"ContainerDied","Data":"136f6da0caaf2340fd15c1a32301bdddbb712bf41669c14f592aaf8d43d192b1"} Feb 23 07:08:51 crc kubenswrapper[5118]: I0223 07:08:51.950579 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" event={"ID":"15482ef3-bf3a-4442-9b2e-89222e09d218","Type":"ContainerStarted","Data":"ef247005c027dd1fcb615daca58067a68edf7c681dbfb3756e7e42eb7e3e5240"} Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.268231 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.269385 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="sg-core" containerID="cri-o://399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0" gracePeriod=30 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.269488 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="proxy-httpd" containerID="cri-o://69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e" gracePeriod=30 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.269757 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-notification-agent" containerID="cri-o://26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f" gracePeriod=30 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.271583 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-central-agent" containerID="cri-o://a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393" gracePeriod=30 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.291776 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.344064 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.345443 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.355741 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965015 5118 generic.go:334] "Generic (PLEG): container finished" podID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerID="69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e" exitCode=0 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965576 5118 generic.go:334] "Generic (PLEG): container finished" podID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerID="399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0" exitCode=2 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965590 5118 generic.go:334] "Generic (PLEG): container finished" podID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerID="a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393" exitCode=0 Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965065 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerDied","Data":"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"} Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965690 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerDied","Data":"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"} Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.965709 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerDied","Data":"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"} Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.971612 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" 
event={"ID":"15482ef3-bf3a-4442-9b2e-89222e09d218","Type":"ContainerStarted","Data":"193b69117be5af96f8bada0bd3b6b78639823387315ad23c2241b27eb7e2cc5a"} Feb 23 07:08:52 crc kubenswrapper[5118]: I0223 07:08:52.972019 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.527957 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.535389 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" podStartSLOduration=3.535369622 podStartE2EDuration="3.535369622s" podCreationTimestamp="2026-02-23 07:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:08:52.996437407 +0000 UTC m=+1396.000221980" watchObservedRunningTime="2026-02-23 07:08:53.535369622 +0000 UTC m=+1396.539154195" Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.541973 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.542381 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-log" containerID="cri-o://59fa67691a10dd3e3fcad8a170edf9e3b186caa3f1705ebdfab9b0fbd839fcc2" gracePeriod=30 Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.542413 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-api" containerID="cri-o://1414c73ef92874acbf185d09448ffefaeea1cdeb57051ac9f893863926df4d56" gracePeriod=30 Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647334 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647415 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647451 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647501 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647603 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647728 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") " 
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.647859 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdwvv\" (UniqueName: \"kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") "
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.648627 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.648940 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml\") pod \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\" (UID: \"285e2f2f-0a3b-4955-b36a-4d94a1c743ce\") "
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.649175 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.649523 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.649546 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.656737 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv" (OuterVolumeSpecName: "kube-api-access-zdwvv") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "kube-api-access-zdwvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.670405 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts" (OuterVolumeSpecName: "scripts") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.689202 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.723391 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.755108 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdwvv\" (UniqueName: \"kubernetes.io/projected/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-kube-api-access-zdwvv\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.755580 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.755660 5118 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.756011 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.761218 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.794981 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data" (OuterVolumeSpecName: "config-data") pod "285e2f2f-0a3b-4955-b36a-4d94a1c743ce" (UID: "285e2f2f-0a3b-4955-b36a-4d94a1c743ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.858784 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.858852 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285e2f2f-0a3b-4955-b36a-4d94a1c743ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.983271 5118 generic.go:334] "Generic (PLEG): container finished" podID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerID="59fa67691a10dd3e3fcad8a170edf9e3b186caa3f1705ebdfab9b0fbd839fcc2" exitCode=143
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.983362 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerDied","Data":"59fa67691a10dd3e3fcad8a170edf9e3b186caa3f1705ebdfab9b0fbd839fcc2"}
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.988277 5118 generic.go:334] "Generic (PLEG): container finished" podID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerID="26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f" exitCode=0
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.988393 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerDied","Data":"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"}
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.988476 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"285e2f2f-0a3b-4955-b36a-4d94a1c743ce","Type":"ContainerDied","Data":"bdd6a988be93b392be6cf671556430693b8fb514b98f8493580a163bef73c59c"}
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.988501 5118 scope.go:117] "RemoveContainer" containerID="69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"
Feb 23 07:08:53 crc kubenswrapper[5118]: I0223 07:08:53.988931 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.016431 5118 scope.go:117] "RemoveContainer" containerID="399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.033664 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.043964 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.055737 5118 scope.go:117] "RemoveContainer" containerID="26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.060734 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.061374 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="proxy-httpd"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.061422 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="proxy-httpd"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.061459 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="sg-core"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.061468 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="sg-core"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.061477 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-notification-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.061484 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-notification-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.061510 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-central-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.061517 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-central-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.064260 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-notification-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.064287 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="sg-core"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.064301 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="proxy-httpd"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.064322 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" containerName="ceilometer-central-agent"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.067557 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.070035 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.072551 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.072874 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.074215 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.092724 5118 scope.go:117] "RemoveContainer" containerID="a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.112341 5118 scope.go:117] "RemoveContainer" containerID="69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.112737 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e\": container with ID starting with 69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e not found: ID does not exist" containerID="69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.112766 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e"} err="failed to get container status \"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e\": rpc error: code = NotFound desc = could not find container \"69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e\": container with ID starting with 69d0d9db7cb1e994ab777e833c9d2df5d78c880385e43af3ba7a3acf9798d54e not found: ID does not exist"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.112789 5118 scope.go:117] "RemoveContainer" containerID="399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.114253 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0\": container with ID starting with 399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0 not found: ID does not exist" containerID="399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.114309 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0"} err="failed to get container status \"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0\": rpc error: code = NotFound desc = could not find container \"399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0\": container with ID starting with 399ab416e61773989f019d8cbb034c1380af2578198bf44e2f35c213452cf6a0 not found: ID does not exist"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.114340 5118 scope.go:117] "RemoveContainer" containerID="26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.114723 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f\": container with ID starting with 26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f not found: ID does not exist" containerID="26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.114805 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f"} err="failed to get container status \"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f\": rpc error: code = NotFound desc = could not find container \"26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f\": container with ID starting with 26117194f9fbc2f3b8f6e92c8b16c7eed053dd677e8a0e8c8f311a7c4ac37a1f not found: ID does not exist"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.114861 5118 scope.go:117] "RemoveContainer" containerID="a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"
Feb 23 07:08:54 crc kubenswrapper[5118]: E0223 07:08:54.115431 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393\": container with ID starting with a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393 not found: ID does not exist" containerID="a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.115462 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393"} err="failed to get container status \"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393\": rpc error: code = NotFound desc = could not find container \"a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393\": container with ID starting with a598cc1071236dcec560e5481945e12cd4f80e708a077124a4c434d1b4641393 not found: ID does not exist"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.164811 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.164873 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.164903 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.164943 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.164967 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fwn\" (UniqueName: \"kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.165005 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.165019 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.165036 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.266986 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267070 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267127 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fwn\" (UniqueName: \"kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267170 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267187 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267267 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.267778 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.268036 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.271520 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.272055 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.272639 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.273768 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.273957 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.284134 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fwn\" (UniqueName: \"kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn\") pod \"ceilometer-0\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.399996 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 07:08:54 crc kubenswrapper[5118]: I0223 07:08:54.878658 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:55 crc kubenswrapper[5118]: I0223 07:08:55.001009 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerStarted","Data":"df178c7e10f31f99e3c5f4d7ecadd03c3649fab878673230ca404ab5a0ce0981"}
Feb 23 07:08:55 crc kubenswrapper[5118]: I0223 07:08:55.709374 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285e2f2f-0a3b-4955-b36a-4d94a1c743ce" path="/var/lib/kubelet/pods/285e2f2f-0a3b-4955-b36a-4d94a1c743ce/volumes"
Feb 23 07:08:56 crc kubenswrapper[5118]: I0223 07:08:56.029119 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerStarted","Data":"8f5a8014e3a875fb07bb3023582b95243c2d46e31018f7ebc42025b11649dfb5"}
Feb 23 07:08:56 crc kubenswrapper[5118]: I0223 07:08:56.139284 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.058739 5118 generic.go:334] "Generic (PLEG): container finished" podID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerID="1414c73ef92874acbf185d09448ffefaeea1cdeb57051ac9f893863926df4d56" exitCode=0
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.059199 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerDied","Data":"1414c73ef92874acbf185d09448ffefaeea1cdeb57051ac9f893863926df4d56"}
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.075929 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerStarted","Data":"8910b598bcdb0b7a6b1ead3db7ef2f2ac462de91d527087c2f1dc7dfa1fb9306"}
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.076359 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerStarted","Data":"9ad9cfa03d5f4a2fc74f22113814dd513cf775e66d7871a4dfeebcbce74a0078"}
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.136732 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.238347 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs\") pod \"b6512a05-f676-4dd2-a35b-80263ce769e0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") "
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.238439 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh89n\" (UniqueName: \"kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n\") pod \"b6512a05-f676-4dd2-a35b-80263ce769e0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") "
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.238751 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data\") pod \"b6512a05-f676-4dd2-a35b-80263ce769e0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") "
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.238904 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle\") pod \"b6512a05-f676-4dd2-a35b-80263ce769e0\" (UID: \"b6512a05-f676-4dd2-a35b-80263ce769e0\") "
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.239628 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs" (OuterVolumeSpecName: "logs") pod "b6512a05-f676-4dd2-a35b-80263ce769e0" (UID: "b6512a05-f676-4dd2-a35b-80263ce769e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.265433 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n" (OuterVolumeSpecName: "kube-api-access-zh89n") pod "b6512a05-f676-4dd2-a35b-80263ce769e0" (UID: "b6512a05-f676-4dd2-a35b-80263ce769e0"). InnerVolumeSpecName "kube-api-access-zh89n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.279290 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data" (OuterVolumeSpecName: "config-data") pod "b6512a05-f676-4dd2-a35b-80263ce769e0" (UID: "b6512a05-f676-4dd2-a35b-80263ce769e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.283541 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6512a05-f676-4dd2-a35b-80263ce769e0" (UID: "b6512a05-f676-4dd2-a35b-80263ce769e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.341749 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.341787 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6512a05-f676-4dd2-a35b-80263ce769e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.341798 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6512a05-f676-4dd2-a35b-80263ce769e0-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.341821 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh89n\" (UniqueName: \"kubernetes.io/projected/b6512a05-f676-4dd2-a35b-80263ce769e0-kube-api-access-zh89n\") on node \"crc\" DevicePath \"\""
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.344086 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.344178 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.356132 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:08:57 crc kubenswrapper[5118]: I0223 07:08:57.382080 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.093242 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6512a05-f676-4dd2-a35b-80263ce769e0","Type":"ContainerDied","Data":"f2f422196d269524611ca79c1f18d1771baaf95b950a6297d7575c232ebf19c0"}
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.093774 5118 scope.go:117] "RemoveContainer" containerID="1414c73ef92874acbf185d09448ffefaeea1cdeb57051ac9f893863926df4d56"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.093249 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.116269 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.179888 5118 scope.go:117] "RemoveContainer" containerID="59fa67691a10dd3e3fcad8a170edf9e3b186caa3f1705ebdfab9b0fbd839fcc2"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.211161 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.220303 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.287123 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:58 crc kubenswrapper[5118]: E0223 07:08:58.288362 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-api"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.288390 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-api"
Feb 23 07:08:58 crc kubenswrapper[5118]: E0223 07:08:58.288436 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-log"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.288445 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-log"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.288935 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-log"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.288987 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" containerName="nova-api-api"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.296250 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.296435 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.299950 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.306083 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.306512 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375256 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375379 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbn5\" (UniqueName: \"kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375412 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375447 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375509 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.375547 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.378469 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.378584 5118
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.478189 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-d7ws6"] Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.478985 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479086 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479135 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479217 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbn5\" (UniqueName: \"kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479246 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.479811 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.481491 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.486365 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.486794 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.487761 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.492188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.492238 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.518548 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.520164 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbn5\" (UniqueName: 
\"kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5\") pod \"nova-api-0\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") " pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.522017 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-d7ws6"] Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.582008 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.582110 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkkk\" (UniqueName: \"kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.582146 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.582182 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 
crc kubenswrapper[5118]: I0223 07:08:58.627621 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.703975 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.704055 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkkk\" (UniqueName: \"kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.704122 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.704197 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.710198 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-d7ws6\" 
(UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.711976 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.715582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.725281 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkkk\" (UniqueName: \"kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk\") pod \"nova-cell1-cell-mapping-d7ws6\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:58 crc kubenswrapper[5118]: I0223 07:08:58.887737 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.103509 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerStarted","Data":"4f450c7e25066260be2baa5f9364bc90e704739b7822bece41543319592f07ed"} Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.103697 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-central-agent" containerID="cri-o://8f5a8014e3a875fb07bb3023582b95243c2d46e31018f7ebc42025b11649dfb5" gracePeriod=30 Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.103954 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.104250 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="sg-core" containerID="cri-o://8910b598bcdb0b7a6b1ead3db7ef2f2ac462de91d527087c2f1dc7dfa1fb9306" gracePeriod=30 Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.104207 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="proxy-httpd" containerID="cri-o://4f450c7e25066260be2baa5f9364bc90e704739b7822bece41543319592f07ed" gracePeriod=30 Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.104270 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-notification-agent" containerID="cri-o://9ad9cfa03d5f4a2fc74f22113814dd513cf775e66d7871a4dfeebcbce74a0078" gracePeriod=30 Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.163441 5118 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:08:59 crc kubenswrapper[5118]: W0223 07:08:59.172206 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef89878_5fea_439a_832c_4862953bacda.slice/crio-50dbd0a7123dd9308a812b2c50560facc13df3029b66e36c67bbef2e5689f382 WatchSource:0}: Error finding container 50dbd0a7123dd9308a812b2c50560facc13df3029b66e36c67bbef2e5689f382: Status 404 returned error can't find the container with id 50dbd0a7123dd9308a812b2c50560facc13df3029b66e36c67bbef2e5689f382 Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.184298 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.073236602 podStartE2EDuration="5.184231436s" podCreationTimestamp="2026-02-23 07:08:54 +0000 UTC" firstStartedPulling="2026-02-23 07:08:54.884697578 +0000 UTC m=+1397.888482161" lastFinishedPulling="2026-02-23 07:08:57.995692412 +0000 UTC m=+1400.999476995" observedRunningTime="2026-02-23 07:08:59.151108093 +0000 UTC m=+1402.154892656" watchObservedRunningTime="2026-02-23 07:08:59.184231436 +0000 UTC m=+1402.188016009" Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.412615 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-d7ws6"] Feb 23 07:08:59 crc kubenswrapper[5118]: I0223 07:08:59.709295 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6512a05-f676-4dd2-a35b-80263ce769e0" path="/var/lib/kubelet/pods/b6512a05-f676-4dd2-a35b-80263ce769e0/volumes" Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.122609 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerStarted","Data":"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 
07:09:00.122663 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerStarted","Data":"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.122678 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerStarted","Data":"50dbd0a7123dd9308a812b2c50560facc13df3029b66e36c67bbef2e5689f382"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128546 5118 generic.go:334] "Generic (PLEG): container finished" podID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerID="4f450c7e25066260be2baa5f9364bc90e704739b7822bece41543319592f07ed" exitCode=0 Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128601 5118 generic.go:334] "Generic (PLEG): container finished" podID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerID="8910b598bcdb0b7a6b1ead3db7ef2f2ac462de91d527087c2f1dc7dfa1fb9306" exitCode=2 Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128610 5118 generic.go:334] "Generic (PLEG): container finished" podID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerID="9ad9cfa03d5f4a2fc74f22113814dd513cf775e66d7871a4dfeebcbce74a0078" exitCode=0 Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerDied","Data":"4f450c7e25066260be2baa5f9364bc90e704739b7822bece41543319592f07ed"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128663 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerDied","Data":"8910b598bcdb0b7a6b1ead3db7ef2f2ac462de91d527087c2f1dc7dfa1fb9306"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.128688 5118 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerDied","Data":"9ad9cfa03d5f4a2fc74f22113814dd513cf775e66d7871a4dfeebcbce74a0078"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.130183 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d7ws6" event={"ID":"f3ee323c-688a-4726-9c5d-75a21d189c67","Type":"ContainerStarted","Data":"e6177a5edc264e84a87c302b966b45b66cce1a423d61e2b512f7e870b53fe9e4"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.130204 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d7ws6" event={"ID":"f3ee323c-688a-4726-9c5d-75a21d189c67","Type":"ContainerStarted","Data":"93e0e47eebdad2a022a7edf4bf12c3dc57de4773bdc29940cff5e0ec54760c6b"} Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.147678 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.147658489 podStartE2EDuration="2.147658489s" podCreationTimestamp="2026-02-23 07:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:00.142604167 +0000 UTC m=+1403.146388750" watchObservedRunningTime="2026-02-23 07:09:00.147658489 +0000 UTC m=+1403.151443062" Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.167270 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-d7ws6" podStartSLOduration=2.167245234 podStartE2EDuration="2.167245234s" podCreationTimestamp="2026-02-23 07:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:00.160521741 +0000 UTC m=+1403.164306324" watchObservedRunningTime="2026-02-23 07:09:00.167245234 +0000 UTC m=+1403.171029807" Feb 23 07:09:00 crc 
kubenswrapper[5118]: I0223 07:09:00.499359 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.567744 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:09:00 crc kubenswrapper[5118]: I0223 07:09:00.568432 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849fff7679-bb58m" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="dnsmasq-dns" containerID="cri-o://867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20" gracePeriod=10 Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.064844 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.171815 5118 generic.go:334] "Generic (PLEG): container finished" podID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerID="867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20" exitCode=0 Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.171930 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849fff7679-bb58m" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.171882 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-bb58m" event={"ID":"8a38ce22-4dfe-44b7-bd53-df058bf38afe","Type":"ContainerDied","Data":"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20"} Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.172015 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849fff7679-bb58m" event={"ID":"8a38ce22-4dfe-44b7-bd53-df058bf38afe","Type":"ContainerDied","Data":"0e2590d9ecf33e189bf395fa71206cd698caaf3963536229dcc415e5e911ab7c"} Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.172034 5118 scope.go:117] "RemoveContainer" containerID="867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.182764 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74xns\" (UniqueName: \"kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.183047 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.183157 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 
07:09:01.183206 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.183559 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.183618 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc\") pod \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\" (UID: \"8a38ce22-4dfe-44b7-bd53-df058bf38afe\") " Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.195816 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns" (OuterVolumeSpecName: "kube-api-access-74xns") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). InnerVolumeSpecName "kube-api-access-74xns". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.234920 5118 scope.go:117] "RemoveContainer" containerID="6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.238654 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.251085 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.288519 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.288660 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74xns\" (UniqueName: \"kubernetes.io/projected/8a38ce22-4dfe-44b7-bd53-df058bf38afe-kube-api-access-74xns\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.288738 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.299550 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.300511 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config" (OuterVolumeSpecName: "config") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.317312 5118 scope.go:117] "RemoveContainer" containerID="867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20" Feb 23 07:09:01 crc kubenswrapper[5118]: E0223 07:09:01.317841 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20\": container with ID starting with 867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20 not found: ID does not exist" containerID="867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.317881 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20"} err="failed to get container status \"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20\": rpc error: code = NotFound desc = could not find container \"867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20\": container with ID starting with 867a319a7eca21fbdaf6f45131a60469af6dfd2319aad5b879c7890a7def5b20 not found: ID does not exist" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.317905 5118 scope.go:117] "RemoveContainer" containerID="6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a" Feb 23 07:09:01 crc kubenswrapper[5118]: E0223 07:09:01.318170 5118 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a\": container with ID starting with 6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a not found: ID does not exist" containerID="6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.318194 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a"} err="failed to get container status \"6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a\": rpc error: code = NotFound desc = could not find container \"6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a\": container with ID starting with 6761615ec5947016085b587c98b9b8428d5aaf1552a6b33dcd7c908b0d5f7d5a not found: ID does not exist" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.322632 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8a38ce22-4dfe-44b7-bd53-df058bf38afe" (UID: "8a38ce22-4dfe-44b7-bd53-df058bf38afe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.390954 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.390991 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.391005 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a38ce22-4dfe-44b7-bd53-df058bf38afe-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.520714 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.541378 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849fff7679-bb58m"] Feb 23 07:09:01 crc kubenswrapper[5118]: I0223 07:09:01.717315 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" path="/var/lib/kubelet/pods/8a38ce22-4dfe-44b7-bd53-df058bf38afe/volumes" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.236046 5118 generic.go:334] "Generic (PLEG): container finished" podID="f3ee323c-688a-4726-9c5d-75a21d189c67" containerID="e6177a5edc264e84a87c302b966b45b66cce1a423d61e2b512f7e870b53fe9e4" exitCode=0 Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.236226 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d7ws6" 
event={"ID":"f3ee323c-688a-4726-9c5d-75a21d189c67","Type":"ContainerDied","Data":"e6177a5edc264e84a87c302b966b45b66cce1a423d61e2b512f7e870b53fe9e4"} Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.247649 5118 generic.go:334] "Generic (PLEG): container finished" podID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerID="8f5a8014e3a875fb07bb3023582b95243c2d46e31018f7ebc42025b11649dfb5" exitCode=0 Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.247728 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerDied","Data":"8f5a8014e3a875fb07bb3023582b95243c2d46e31018f7ebc42025b11649dfb5"} Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.601390 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.713391 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.715140 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.715334 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.715385 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.715545 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.715585 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.716286 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.716661 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8fwn\" (UniqueName: \"kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.716974 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs\") pod \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\" (UID: \"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa\") " Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.716662 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.718297 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.718647 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.722140 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn" (OuterVolumeSpecName: "kube-api-access-w8fwn") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "kube-api-access-w8fwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.725358 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts" (OuterVolumeSpecName: "scripts") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.793818 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.794883 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.815506 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.821046 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.821075 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.821085 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.821125 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8fwn\" (UniqueName: \"kubernetes.io/projected/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-kube-api-access-w8fwn\") 
on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.821135 5118 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.855050 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data" (OuterVolumeSpecName: "config-data") pod "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" (UID: "0197a7bd-cb53-4fd4-b862-f2d3883ab1aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:05 crc kubenswrapper[5118]: I0223 07:09:05.923856 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.266751 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0197a7bd-cb53-4fd4-b862-f2d3883ab1aa","Type":"ContainerDied","Data":"df178c7e10f31f99e3c5f4d7ecadd03c3649fab878673230ca404ab5a0ce0981"} Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.266855 5118 scope.go:117] "RemoveContainer" containerID="4f450c7e25066260be2baa5f9364bc90e704739b7822bece41543319592f07ed" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.266788 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.308497 5118 scope.go:117] "RemoveContainer" containerID="8910b598bcdb0b7a6b1ead3db7ef2f2ac462de91d527087c2f1dc7dfa1fb9306" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.336335 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.346667 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.362783 5118 scope.go:117] "RemoveContainer" containerID="9ad9cfa03d5f4a2fc74f22113814dd513cf775e66d7871a4dfeebcbce74a0078" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.391947 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392742 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="sg-core" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392775 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="sg-core" Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392813 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-central-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392828 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-central-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392856 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="init" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392869 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="init" Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392886 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="proxy-httpd" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392902 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="proxy-httpd" Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392945 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="dnsmasq-dns" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392959 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="dnsmasq-dns" Feb 23 07:09:06 crc kubenswrapper[5118]: E0223 07:09:06.392983 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-notification-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.392996 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-notification-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.393344 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-notification-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.393370 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="ceilometer-central-agent" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.393395 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="proxy-httpd" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.393435 5118 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" containerName="sg-core" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.393460 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a38ce22-4dfe-44b7-bd53-df058bf38afe" containerName="dnsmasq-dns" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.396855 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.401933 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.405464 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.408670 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.408690 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.435520 5118 scope.go:117] "RemoveContainer" containerID="8f5a8014e3a875fb07bb3023582b95243c2d46e31018f7ebc42025b11649dfb5" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537605 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537680 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537704 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537769 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537852 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.537874 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kq2p\" (UniqueName: \"kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 
07:09:06.537924 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640005 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640074 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kq2p\" (UniqueName: \"kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640202 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640255 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640287 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640306 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640327 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.640349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.647484 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.647616 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.648407 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.648607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.648998 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.649527 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.663690 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kq2p\" (UniqueName: \"kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.675601 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data\") pod \"ceilometer-0\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") 
" pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.727178 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.859584 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d7ws6" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.949682 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle\") pod \"f3ee323c-688a-4726-9c5d-75a21d189c67\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.949776 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkkk\" (UniqueName: \"kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk\") pod \"f3ee323c-688a-4726-9c5d-75a21d189c67\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.949834 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts\") pod \"f3ee323c-688a-4726-9c5d-75a21d189c67\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.949900 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data\") pod \"f3ee323c-688a-4726-9c5d-75a21d189c67\" (UID: \"f3ee323c-688a-4726-9c5d-75a21d189c67\") " Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.956945 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts" (OuterVolumeSpecName: "scripts") pod "f3ee323c-688a-4726-9c5d-75a21d189c67" (UID: "f3ee323c-688a-4726-9c5d-75a21d189c67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.957165 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk" (OuterVolumeSpecName: "kube-api-access-jxkkk") pod "f3ee323c-688a-4726-9c5d-75a21d189c67" (UID: "f3ee323c-688a-4726-9c5d-75a21d189c67"). InnerVolumeSpecName "kube-api-access-jxkkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.984222 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ee323c-688a-4726-9c5d-75a21d189c67" (UID: "f3ee323c-688a-4726-9c5d-75a21d189c67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:06 crc kubenswrapper[5118]: I0223 07:09:06.987408 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data" (OuterVolumeSpecName: "config-data") pod "f3ee323c-688a-4726-9c5d-75a21d189c67" (UID: "f3ee323c-688a-4726-9c5d-75a21d189c67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.052506 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.052546 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkkk\" (UniqueName: \"kubernetes.io/projected/f3ee323c-688a-4726-9c5d-75a21d189c67-kube-api-access-jxkkk\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.052561 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.052572 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ee323c-688a-4726-9c5d-75a21d189c67-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:07 crc kubenswrapper[5118]: W0223 07:09:07.204383 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a7f53e_845e_4dfd_a80d_f790b60270fc.slice/crio-095c3652bb6291c30e57cd77c235c928b95d49ecee6ea8d409c2537557574246 WatchSource:0}: Error finding container 095c3652bb6291c30e57cd77c235c928b95d49ecee6ea8d409c2537557574246: Status 404 returned error can't find the container with id 095c3652bb6291c30e57cd77c235c928b95d49ecee6ea8d409c2537557574246
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.217879 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.280774 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-d7ws6" event={"ID":"f3ee323c-688a-4726-9c5d-75a21d189c67","Type":"ContainerDied","Data":"93e0e47eebdad2a022a7edf4bf12c3dc57de4773bdc29940cff5e0ec54760c6b"}
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.280857 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e0e47eebdad2a022a7edf4bf12c3dc57de4773bdc29940cff5e0ec54760c6b"
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.280920 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-d7ws6"
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.283834 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerStarted","Data":"095c3652bb6291c30e57cd77c235c928b95d49ecee6ea8d409c2537557574246"}
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.352080 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.360607 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.362951 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.468548 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.468856 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-log" containerID="cri-o://63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce" gracePeriod=30
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.468981 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-api" containerID="cri-o://cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a" gracePeriod=30
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.554608 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.555156 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerName="nova-scheduler-scheduler" containerID="cri-o://d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" gracePeriod=30
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.579872 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:09:07 crc kubenswrapper[5118]: I0223 07:09:07.716664 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0197a7bd-cb53-4fd4-b862-f2d3883ab1aa" path="/var/lib/kubelet/pods/0197a7bd-cb53-4fd4-b862-f2d3883ab1aa/volumes"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.034006 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197288 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197419 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197538 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197663 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197851 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197895 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbn5\" (UniqueName: \"kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5\") pod \"6ef89878-5fea-439a-832c-4862953bacda\" (UID: \"6ef89878-5fea-439a-832c-4862953bacda\") "
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.197975 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs" (OuterVolumeSpecName: "logs") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.199294 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ef89878-5fea-439a-832c-4862953bacda-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.207635 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5" (OuterVolumeSpecName: "kube-api-access-hcbn5") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "kube-api-access-hcbn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.228202 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.232695 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data" (OuterVolumeSpecName: "config-data") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.258624 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.273324 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6ef89878-5fea-439a-832c-4862953bacda" (UID: "6ef89878-5fea-439a-832c-4862953bacda"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.296336 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerStarted","Data":"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8"}
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.298656 5118 generic.go:334] "Generic (PLEG): container finished" podID="6ef89878-5fea-439a-832c-4862953bacda" containerID="cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a" exitCode=0
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.298686 5118 generic.go:334] "Generic (PLEG): container finished" podID="6ef89878-5fea-439a-832c-4862953bacda" containerID="63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce" exitCode=143
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.300128 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.300230 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerDied","Data":"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"}
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.300289 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerDied","Data":"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"}
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.300304 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6ef89878-5fea-439a-832c-4862953bacda","Type":"ContainerDied","Data":"50dbd0a7123dd9308a812b2c50560facc13df3029b66e36c67bbef2e5689f382"}
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.300328 5118 scope.go:117] "RemoveContainer" containerID="cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.302457 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.302476 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbn5\" (UniqueName: \"kubernetes.io/projected/6ef89878-5fea-439a-832c-4862953bacda-kube-api-access-hcbn5\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.302537 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.302552 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.302565 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef89878-5fea-439a-832c-4862953bacda-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.316979 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.460199 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.486151 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.493708 5118 scope.go:117] "RemoveContainer" containerID="63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.513751 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.514257 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ee323c-688a-4726-9c5d-75a21d189c67" containerName="nova-manage"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514276 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ee323c-688a-4726-9c5d-75a21d189c67" containerName="nova-manage"
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.514289 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-log"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514296 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-log"
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.514311 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-api"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514316 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-api"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514882 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-log"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514902 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ee323c-688a-4726-9c5d-75a21d189c67" containerName="nova-manage"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.514911 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef89878-5fea-439a-832c-4862953bacda" containerName="nova-api-api"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.516404 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.520258 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.520548 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.520722 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.527050 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.542889 5118 scope.go:117] "RemoveContainer" containerID="cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.554652 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a\": container with ID starting with cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a not found: ID does not exist" containerID="cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.554710 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"} err="failed to get container status \"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a\": rpc error: code = NotFound desc = could not find container \"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a\": container with ID starting with cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a not found: ID does not exist"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.554742 5118 scope.go:117] "RemoveContainer" containerID="63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.557443 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce\": container with ID starting with 63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce not found: ID does not exist" containerID="63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.557480 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"} err="failed to get container status \"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce\": rpc error: code = NotFound desc = could not find container \"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce\": container with ID starting with 63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce not found: ID does not exist"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.557499 5118 scope.go:117] "RemoveContainer" containerID="cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.561892 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a"} err="failed to get container status \"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a\": rpc error: code = NotFound desc = could not find container \"cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a\": container with ID starting with cdf108827f90b1eafdf64f4fe8f7eae5893626c011ee2e47c24c5ff024e16e2a not found: ID does not exist"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.561925 5118 scope.go:117] "RemoveContainer" containerID="63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.571188 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce"} err="failed to get container status \"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce\": rpc error: code = NotFound desc = could not find container \"63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce\": container with ID starting with 63f0b52d3ad15fb9f94ac005dd8546db5c8c806025e4d1976e8d5e7aefcda4ce not found: ID does not exist"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619539 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619637 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjkg8\" (UniqueName: \"kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619667 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619699 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.619728 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721254 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721343 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721394 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjkg8\" (UniqueName: \"kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721514 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.721539 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.730816 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.734160 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.734991 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.738218 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.743815 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.746953 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjkg8\" (UniqueName: \"kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8\") pod \"nova-api-0\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: I0223 07:09:08.861328 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.983387 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.985331 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.986652 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:09:08 crc kubenswrapper[5118]: E0223 07:09:08.986685 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerName="nova-scheduler-scheduler"
Feb 23 07:09:09 crc kubenswrapper[5118]: I0223 07:09:09.317541 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:09:09 crc kubenswrapper[5118]: I0223 07:09:09.317910 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerStarted","Data":"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7"}
Feb 23 07:09:09 crc kubenswrapper[5118]: I0223 07:09:09.326623 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" containerID="cri-o://79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b" gracePeriod=30
Feb 23 07:09:09 crc kubenswrapper[5118]: I0223 07:09:09.327150 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" containerID="cri-o://eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35" gracePeriod=30
Feb 23 07:09:09 crc kubenswrapper[5118]: I0223 07:09:09.709541 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef89878-5fea-439a-832c-4862953bacda" path="/var/lib/kubelet/pods/6ef89878-5fea-439a-832c-4862953bacda/volumes"
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.344632 5118 generic.go:334] "Generic (PLEG): container finished" podID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerID="79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b" exitCode=143
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.344725 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerDied","Data":"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b"}
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.349745 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerStarted","Data":"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18"}
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.352689 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerStarted","Data":"5285ecb4115449aa0975018a2c7a4ad15449770fc6a16a5b4cc0704a6eea208b"}
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.352881 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerStarted","Data":"af045486508fe8cab1c8b590668cb9fcd431b4bc730c81a19eb7129715e8530d"}
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.353067 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerStarted","Data":"53b693b3924399c70496c4b7866c72831302d508f70145998a682b5cd6cadbf2"}
Feb 23 07:09:10 crc kubenswrapper[5118]: I0223 07:09:10.387522 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.387496782 podStartE2EDuration="2.387496782s" podCreationTimestamp="2026-02-23 07:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:10.374090846 +0000 UTC m=+1413.377875449" watchObservedRunningTime="2026-02-23 07:09:10.387496782 +0000 UTC m=+1413.391281395"
Feb 23 07:09:11 crc kubenswrapper[5118]: I0223 07:09:11.373236 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerStarted","Data":"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274"}
Feb 23 07:09:11 crc kubenswrapper[5118]: I0223 07:09:11.373755 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 23 07:09:11 crc kubenswrapper[5118]: I0223 07:09:11.420580 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.196414756 podStartE2EDuration="5.420554515s" podCreationTimestamp="2026-02-23 07:09:06 +0000 UTC" firstStartedPulling="2026-02-23 07:09:07.207086684 +0000 UTC m=+1410.210871257" lastFinishedPulling="2026-02-23 07:09:10.431226433 +0000 UTC m=+1413.435011016" observedRunningTime="2026-02-23 07:09:11.409689161 +0000 UTC m=+1414.413473764" watchObservedRunningTime="2026-02-23 07:09:11.420554515 +0000 UTC m=+1414.424339118"
Feb 23 07:09:12 crc kubenswrapper[5118]: I0223 07:09:12.474918 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:52234->10.217.0.193:8775: read: connection reset by peer"
Feb 23 07:09:12 crc kubenswrapper[5118]: I0223 07:09:12.475791 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:52220->10.217.0.193:8775: read: connection reset by peer"
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.008328 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.072372 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.131449 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs\") pod \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.131960 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs" (OuterVolumeSpecName: "logs") pod "df07d4ae-f7da-44f0-80c1-a8e7b935ba26" (UID: "df07d4ae-f7da-44f0-80c1-a8e7b935ba26"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.134407 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle\") pod \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.134476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs\") pod \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.134543 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rpg\" (UniqueName: \"kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg\") pod \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.134578 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data\") pod \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\" (UID: \"df07d4ae-f7da-44f0-80c1-a8e7b935ba26\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.135200 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.141306 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg" (OuterVolumeSpecName: "kube-api-access-44rpg") pod "df07d4ae-f7da-44f0-80c1-a8e7b935ba26" (UID: "df07d4ae-f7da-44f0-80c1-a8e7b935ba26"). InnerVolumeSpecName "kube-api-access-44rpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.175721 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df07d4ae-f7da-44f0-80c1-a8e7b935ba26" (UID: "df07d4ae-f7da-44f0-80c1-a8e7b935ba26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.177471 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data" (OuterVolumeSpecName: "config-data") pod "df07d4ae-f7da-44f0-80c1-a8e7b935ba26" (UID: "df07d4ae-f7da-44f0-80c1-a8e7b935ba26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.195388 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "df07d4ae-f7da-44f0-80c1-a8e7b935ba26" (UID: "df07d4ae-f7da-44f0-80c1-a8e7b935ba26"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.238251 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle\") pod \"703b9028-5e0b-426c-bfe4-7751f8a68276\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.238699 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5jhq\" (UniqueName: \"kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq\") pod \"703b9028-5e0b-426c-bfe4-7751f8a68276\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.238800 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data\") pod \"703b9028-5e0b-426c-bfe4-7751f8a68276\" (UID: \"703b9028-5e0b-426c-bfe4-7751f8a68276\") "
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.239301 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.239320 5118 reconciler_common.go:293] "Volume detached for volume
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.239331 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rpg\" (UniqueName: \"kubernetes.io/projected/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-kube-api-access-44rpg\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.239342 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df07d4ae-f7da-44f0-80c1-a8e7b935ba26-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.243004 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq" (OuterVolumeSpecName: "kube-api-access-r5jhq") pod "703b9028-5e0b-426c-bfe4-7751f8a68276" (UID: "703b9028-5e0b-426c-bfe4-7751f8a68276"). InnerVolumeSpecName "kube-api-access-r5jhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.265840 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data" (OuterVolumeSpecName: "config-data") pod "703b9028-5e0b-426c-bfe4-7751f8a68276" (UID: "703b9028-5e0b-426c-bfe4-7751f8a68276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.266701 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "703b9028-5e0b-426c-bfe4-7751f8a68276" (UID: "703b9028-5e0b-426c-bfe4-7751f8a68276"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.342253 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.342296 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5jhq\" (UniqueName: \"kubernetes.io/projected/703b9028-5e0b-426c-bfe4-7751f8a68276-kube-api-access-r5jhq\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.342309 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/703b9028-5e0b-426c-bfe4-7751f8a68276-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.426315 5118 generic.go:334] "Generic (PLEG): container finished" podID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerID="eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35" exitCode=0 Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.426749 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerDied","Data":"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35"} Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.426953 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df07d4ae-f7da-44f0-80c1-a8e7b935ba26","Type":"ContainerDied","Data":"2f53954a5a26a859544ed3edf9f7f958daf5e21dc1c11ca0926a01d788925294"} Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.427171 5118 scope.go:117] "RemoveContainer" containerID="eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.427412 5118 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.431935 5118 generic.go:334] "Generic (PLEG): container finished" podID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" exitCode=0 Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.431978 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"703b9028-5e0b-426c-bfe4-7751f8a68276","Type":"ContainerDied","Data":"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c"} Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.432003 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"703b9028-5e0b-426c-bfe4-7751f8a68276","Type":"ContainerDied","Data":"fd36ed7d3b6f739815a0cd87960249edab1b3c001bdbc2aaf4f1ba833fcf1628"} Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.432090 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.473515 5118 scope.go:117] "RemoveContainer" containerID="79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.515641 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.527975 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.539968 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.553135 5118 scope.go:117] "RemoveContainer" containerID="eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.554203 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35\": container with ID starting with eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35 not found: ID does not exist" containerID="eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.554234 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35"} err="failed to get container status \"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35\": rpc error: code = NotFound desc = could not find container \"eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35\": container with ID starting with eab7a80608749038a3b93a29864d57d843b6401db78e7c5adf174a98bf0f4c35 not found: ID does not exist" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.554255 5118 
scope.go:117] "RemoveContainer" containerID="79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.560126 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b\": container with ID starting with 79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b not found: ID does not exist" containerID="79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.560164 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b"} err="failed to get container status \"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b\": rpc error: code = NotFound desc = could not find container \"79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b\": container with ID starting with 79e061b98e14e17e7fa3fccfb3ee7e521dfedd4fb9ac2b0504415cc1559b635b not found: ID does not exist" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.560188 5118 scope.go:117] "RemoveContainer" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.563195 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.573048 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.573680 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.573748 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.573969 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.574034 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.574185 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerName="nova-scheduler-scheduler" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.574240 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerName="nova-scheduler-scheduler" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.574729 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-log" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.574809 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" containerName="nova-scheduler-scheduler" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.574863 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" containerName="nova-metadata-metadata" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.580798 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.581263 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.583252 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.584051 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.595150 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.596593 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.598546 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.609686 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.623053 5118 scope.go:117] "RemoveContainer" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.623608 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c\": container with ID starting with d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c not found: ID does not exist" containerID="d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.623647 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c"} err="failed to get container status \"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c\": rpc error: code = NotFound desc = could not find container \"d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c\": container with ID starting with d14d03931dc0e906f7a493bea52b3b6aff17f02592156d77d2eb3e145197b10c not found: ID does not exist" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.647568 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdnn\" (UniqueName: \"kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.647845 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.647981 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.648181 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 
07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.648327 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: E0223 07:09:13.706967 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod703b9028_5e0b_426c_bfe4_7751f8a68276.slice/crio-fd36ed7d3b6f739815a0cd87960249edab1b3c001bdbc2aaf4f1ba833fcf1628\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod703b9028_5e0b_426c_bfe4_7751f8a68276.slice\": RecentStats: unable to find data in memory cache]" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.713135 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703b9028-5e0b-426c-bfe4-7751f8a68276" path="/var/lib/kubelet/pods/703b9028-5e0b-426c-bfe4-7751f8a68276/volumes" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.713969 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df07d4ae-f7da-44f0-80c1-a8e7b935ba26" path="/var/lib/kubelet/pods/df07d4ae-f7da-44f0-80c1-a8e7b935ba26/volumes" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751142 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrhk\" (UniqueName: \"kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751265 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vgdnn\" (UniqueName: \"kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751791 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751895 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.751978 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.752021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.752056 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.752959 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.756821 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.762300 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.773594 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.778623 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgdnn\" (UniqueName: \"kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn\") pod \"nova-metadata-0\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.856235 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.856376 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrhk\" (UniqueName: \"kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.856424 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.864785 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.866003 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.896768 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrhk\" (UniqueName: \"kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk\") pod \"nova-scheduler-0\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") " pod="openstack/nova-scheduler-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.926169 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:09:13 crc kubenswrapper[5118]: I0223 07:09:13.932769 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:09:14 crc kubenswrapper[5118]: W0223 07:09:14.490545 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf1e2e3d_0fc3_474a_a15d_6808347c8240.slice/crio-fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268 WatchSource:0}: Error finding container fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268: Status 404 returned error can't find the container with id fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268 Feb 23 07:09:14 crc kubenswrapper[5118]: I0223 07:09:14.500540 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:09:14 crc kubenswrapper[5118]: W0223 07:09:14.561792 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84fdf432_1886_4e91_bd3c_bca6f1b90c3a.slice/crio-269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc WatchSource:0}: Error finding container 269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc: Status 404 returned error can't find the container with id 
269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc Feb 23 07:09:14 crc kubenswrapper[5118]: I0223 07:09:14.562413 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.472081 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerStarted","Data":"63d5f3f40e0bd27af62b5fc9b31aa60381f047575c6ba94cbf1284e1e6bbf343"} Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.472612 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerStarted","Data":"d26689d86956826776f6a68df424f29414b578bdb30f6c4e05418d6010a876b3"} Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.472638 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerStarted","Data":"269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc"} Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.475638 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf1e2e3d-0fc3-474a-a15d-6808347c8240","Type":"ContainerStarted","Data":"399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d"} Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.475718 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf1e2e3d-0fc3-474a-a15d-6808347c8240","Type":"ContainerStarted","Data":"fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268"} Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.566151 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.566127769 podStartE2EDuration="2.566127769s" podCreationTimestamp="2026-02-23 
07:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:15.543947191 +0000 UTC m=+1418.547731764" watchObservedRunningTime="2026-02-23 07:09:15.566127769 +0000 UTC m=+1418.569912352" Feb 23 07:09:15 crc kubenswrapper[5118]: I0223 07:09:15.568900 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5688895560000002 podStartE2EDuration="2.568889556s" podCreationTimestamp="2026-02-23 07:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:09:15.561183689 +0000 UTC m=+1418.564968262" watchObservedRunningTime="2026-02-23 07:09:15.568889556 +0000 UTC m=+1418.572674139" Feb 23 07:09:18 crc kubenswrapper[5118]: I0223 07:09:18.862742 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:09:18 crc kubenswrapper[5118]: I0223 07:09:18.863135 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 07:09:18 crc kubenswrapper[5118]: I0223 07:09:18.926344 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:09:18 crc kubenswrapper[5118]: I0223 07:09:18.928230 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 07:09:18 crc kubenswrapper[5118]: I0223 07:09:18.933384 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 07:09:19 crc kubenswrapper[5118]: I0223 07:09:19.882429 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Feb 23 07:09:19 crc kubenswrapper[5118]: I0223 07:09:19.883670 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:09:23 crc kubenswrapper[5118]: I0223 07:09:23.926490 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:09:23 crc kubenswrapper[5118]: I0223 07:09:23.927540 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 07:09:23 crc kubenswrapper[5118]: I0223 07:09:23.934028 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 07:09:23 crc kubenswrapper[5118]: I0223 07:09:23.983579 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 07:09:24 crc kubenswrapper[5118]: I0223 07:09:24.656037 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 07:09:24 crc kubenswrapper[5118]: I0223 07:09:24.950416 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:09:24 crc kubenswrapper[5118]: I0223 07:09:24.950949 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Feb 23 07:09:28 crc kubenswrapper[5118]: I0223 07:09:28.872987 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:09:28 crc kubenswrapper[5118]: I0223 07:09:28.876071 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:09:28 crc kubenswrapper[5118]: I0223 07:09:28.876918 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 07:09:28 crc kubenswrapper[5118]: I0223 07:09:28.888272 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5118]: I0223 07:09:29.694415 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 07:09:29 crc kubenswrapper[5118]: I0223 07:09:29.718971 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 07:09:33 crc kubenswrapper[5118]: I0223 07:09:33.935420 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:09:33 crc kubenswrapper[5118]: I0223 07:09:33.936984 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 07:09:33 crc kubenswrapper[5118]: I0223 07:09:33.948360 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:09:33 crc kubenswrapper[5118]: I0223 07:09:33.952962 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 07:09:36 crc kubenswrapper[5118]: I0223 07:09:36.747415 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.815777 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-284f-account-create-update-75k6g"] Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.817837 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.820983 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.825026 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-skw6h"] Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.826637 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.829009 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.863260 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-284f-account-create-update-75k6g"] Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.871023 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4gmc\" (UniqueName: \"kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.871136 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc 
kubenswrapper[5118]: I0223 07:09:55.871187 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.871238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gvm\" (UniqueName: \"kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.874849 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-skw6h"] Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.972962 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4gmc\" (UniqueName: \"kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.973110 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.973165 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.973223 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gvm\" (UniqueName: \"kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.974312 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:55 crc kubenswrapper[5118]: I0223 07:09:55.974607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.027985 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4gmc\" (UniqueName: \"kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc\") pod \"root-account-create-update-skw6h\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.047826 5118 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.049446 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.051749 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gvm\" (UniqueName: \"kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm\") pod \"glance-284f-account-create-update-75k6g\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") " pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075269 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075493 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fczfn\" (UniqueName: \"kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075543 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075566 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075582 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.075618 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.077255 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.099205 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.099563 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="cinder-scheduler" containerID="cri-o://56b4d180783d05f72c701c589b061dafc5eb3886c60c04d8b8ec016724107c8f" gracePeriod=30 Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.099741 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="probe" containerID="cri-o://befb16d8a3f73c9a8d367e972c7e2cebda984c914cdc82b0856f084d6aa276f7" gracePeriod=30 Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.110884 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.126553 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.142559 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.144371 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.153423 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.157871 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-284f-account-create-update-75k6g" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.168293 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-skw6h" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.178282 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180587 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhgn\" (UniqueName: \"kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180645 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts\") pod \"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180683 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180754 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztlj6\" (UniqueName: \"kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6\") pod \"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180791 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fczfn\" (UniqueName: \"kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180812 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180834 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180851 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180887 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.180914 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.183773 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.195299 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.208567 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.223665 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.223748 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.264086 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fczfn\" (UniqueName: \"kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn\") pod \"barbican-keystone-listener-587b7cf474-nwcvp\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.272375 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288396 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288476 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288540 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhgn\" (UniqueName: \"kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts\") pod \"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288585 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " 
pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288643 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42tc\" (UniqueName: \"kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288679 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.288709 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztlj6\" (UniqueName: \"kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6\") pod \"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.289470 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.289771 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts\") pod 
\"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.289813 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.314264 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-284f-account-create-update-w9fdc"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.329686 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-284f-account-create-update-w9fdc"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.351073 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.352793 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8c77bbddd-85rm4" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.362401 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztlj6\" (UniqueName: \"kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6\") pod \"placement-2e07-account-create-update-9hjkr\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.363603 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.386722 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.387253 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.390731 
5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.397394 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.397644 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.397774 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.397947 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398116 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398260 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398440 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398581 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srdls\" (UniqueName: \"kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398707 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl94x\" (UniqueName: \"kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398827 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.398948 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.399127 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.399229 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42tc\" (UniqueName: \"kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.399599 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fst8q"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.413779 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.414088 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhgn\" (UniqueName: \"kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn\") pod \"barbican-a094-account-create-update-5m7nf\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " pod="openstack/barbican-a094-account-create-update-5m7nf"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.417608 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-9hjkr"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.418431 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.419793 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" containerName="openstackclient" containerID="cri-o://4ba34d66d196c48e8577f6109b59eff4f5b86c53ed431eae0c2091d19a42b92c" gracePeriod=2
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.420013 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.433647 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fst8q"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.434917 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.441565 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.442346 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.443261 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.444236 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.456514 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.460430 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.498510 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42tc\" (UniqueName: \"kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc\") pod \"barbican-worker-75595465d9-2pkqt\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.523873 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"]
Feb 23 07:09:56 crc kubenswrapper[5118]: E0223 07:09:56.524505 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" containerName="openstackclient"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.524527 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" containerName="openstackclient"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.524732 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" containerName="openstackclient"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.525482 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.536798 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.538820 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.538869 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.538904 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.538958 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.539033 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.539084 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.539134 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srdls\" (UniqueName: \"kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.539183 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl94x\" (UniqueName: \"kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.539256 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.606518 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.607179 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: E0223 07:09:56.607515 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:09:56 crc kubenswrapper[5118]: E0223 07:09:56.607638 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:09:57.107616465 +0000 UTC m=+1460.111401028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.640293 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.644667 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.661724 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.667419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.679898 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.692734 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-5m7nf"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.708164 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.728505 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srdls\" (UniqueName: \"kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls\") pod \"neutron-f94b-account-create-update-cvr4d\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.744382 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-75595465d9-2pkqt"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.761664 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fgs\" (UniqueName: \"kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.761797 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.853326 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.853755 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api-log" containerID="cri-o://744308f3e6aa45dc8d65925950643c4deaca79d38d7fbf553daec9ce3a79d864" gracePeriod=30
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.853961 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" containerID="cri-o://50ca6bcdb97f2c6020994172d7c32e3ae68a54330a49a74eecef1ee656c8e4db" gracePeriod=30
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.860171 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl94x\" (UniqueName: \"kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x\") pod \"barbican-api-8c77bbddd-85rm4\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.865585 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.866698 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fgs\" (UniqueName: \"kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.868112 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.896976 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2e07-account-create-update-sqj4n"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.922369 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-cvr4d"
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.933417 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2e07-account-create-update-sqj4n"]
Feb 23 07:09:56 crc kubenswrapper[5118]: I0223 07:09:56.973001 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:56.992854 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:56.993116 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.013495 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.020317 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.039643 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.039966 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.078382 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.078520 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvnw7\" (UniqueName: \"kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.098187 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a094-account-create-update-htzlh"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.119850 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.126318 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8c77bbddd-85rm4"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.161526 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a094-account-create-update-htzlh"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.182815 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvnw7\" (UniqueName: \"kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.182860 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88fwz\" (UniqueName: \"kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.182962 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.183002 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: E0223 07:09:57.183257 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:09:57 crc kubenswrapper[5118]: E0223 07:09:57.183363 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:09:58.183344812 +0000 UTC m=+1461.187129385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.183997 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.215389 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.216662 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.222296 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.244195 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fgs\" (UniqueName: \"kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs\") pod \"cinder-6db8-account-create-update-2ckqc\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") " pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.258176 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.269694 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.287600 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f94b-account-create-update-dxxsj"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.288896 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88fwz\" (UniqueName: \"kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.289017 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzxf\" (UniqueName: \"kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.289046 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.289125 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.290245 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.329181 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f94b-account-create-update-dxxsj"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.359364 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.361943 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.397250 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzxf\" (UniqueName: \"kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.397708 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.399031 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.407711 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvnw7\" (UniqueName: \"kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7\") pod \"nova-api-2de9-account-create-update-7hqls\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") " pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.407790 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.443679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88fwz\" (UniqueName: \"kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz\") pod \"nova-cell1-241c-account-create-update-2sgxt\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " pod="openstack/nova-cell1-241c-account-create-update-2sgxt"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.445448 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.445725 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd" containerID="cri-o://b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" gracePeriod=30
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.445873 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="openstack-network-exporter" containerID="cri-o://25d2ae49d9a05edcb3c8c8fa2bea4dca27a3dbb07af2c6c8fc988dc353770b62" gracePeriod=30
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.454876 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzxf\" (UniqueName: \"kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf\") pod \"nova-cell0-9656-account-create-update-dshm9\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") " pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.505011 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2de9-account-create-update-8hbsf"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.506898 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.526739 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.527282 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rh8\" (UniqueName: \"kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.588188 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2de9-account-create-update-8hbsf"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.621554 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6db8-account-create-update-c2rqc"]
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.654036 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rh8\" (UniqueName: \"kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.654473 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.654563 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.655331 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:09:57 crc
kubenswrapper[5118]: I0223 07:09:57.655942 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.749263 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-2sgxt" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.795380 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="235a2c28-6291-44bc-8b66-10fe612e0c9e" path="/var/lib/kubelet/pods/235a2c28-6291-44bc-8b66-10fe612e0c9e/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.796485 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-7hqls" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.797326 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35127816-7431-4cc3-abde-24014813fbec" path="/var/lib/kubelet/pods/35127816-7431-4cc3-abde-24014813fbec/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.797883 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af5d4fa-f037-4eb9-a893-7506a4541eb5" path="/var/lib/kubelet/pods/3af5d4fa-f037-4eb9-a893-7506a4541eb5/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.798407 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ec3af7-316f-43bc-9a57-cee6a641b441" path="/var/lib/kubelet/pods/50ec3af7-316f-43bc-9a57-cee6a641b441/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.802161 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66bdd8af-d48f-4419-83cc-c06f0b71ee32" 
path="/var/lib/kubelet/pods/66bdd8af-d48f-4419-83cc-c06f0b71ee32/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.809825 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deca025a-7a57-4244-84e3-541fd5f7760d" path="/var/lib/kubelet/pods/deca025a-7a57-4244-84e3-541fd5f7760d/volumes" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.826039 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-dshm9" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.829814 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.829860 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6db8-account-create-update-c2rqc"] Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.830236 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="openstack-network-exporter" containerID="cri-o://2ddc356225a64a2567d2c939ebf5232096d7165e8a350bbdddf0c8812106b5e2" gracePeriod=300 Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.886127 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rh8\" (UniqueName: \"kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8\") pod \"redhat-marketplace-mtk98\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") " pod="openshift-marketplace/redhat-marketplace-mtk98" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.917369 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtk98" Feb 23 07:09:57 crc kubenswrapper[5118]: I0223 07:09:57.954474 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="ovsdbserver-nb" containerID="cri-o://49b70411a72db2ea12793c02e06b682bbf363fae840fa8bc8f2d66bc890a6eef" gracePeriod=300 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.290888 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.300960 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="openstack-network-exporter" containerID="cri-o://4bcc61517bff899445a02cd4853722bfdc2eee81bfb4ecb4fabf4e9cdeba87db" gracePeriod=300 Feb 23 07:09:58 crc kubenswrapper[5118]: E0223 07:09:58.348218 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:09:58 crc kubenswrapper[5118]: E0223 07:09:58.406022 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:00.348302804 +0000 UTC m=+1463.352087377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.408273 5118 generic.go:334] "Generic (PLEG): container finished" podID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerID="25d2ae49d9a05edcb3c8c8fa2bea4dca27a3dbb07af2c6c8fc988dc353770b62" exitCode=2 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.408328 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerDied","Data":"25d2ae49d9a05edcb3c8c8fa2bea4dca27a3dbb07af2c6c8fc988dc353770b62"} Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.417189 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-tnztd"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.446156 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-tnztd"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.452661 5118 generic.go:334] "Generic (PLEG): container finished" podID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerID="744308f3e6aa45dc8d65925950643c4deaca79d38d7fbf553daec9ce3a79d864" exitCode=143 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.459265 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerDied","Data":"744308f3e6aa45dc8d65925950643c4deaca79d38d7fbf553daec9ce3a79d864"} Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.492347 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-k8fz9"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.568590 5118 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_015de465-4df3-4178-b28b-3dd5ec0f37aa/ovsdbserver-nb/0.log" Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.568641 5118 generic.go:334] "Generic (PLEG): container finished" podID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerID="2ddc356225a64a2567d2c939ebf5232096d7165e8a350bbdddf0c8812106b5e2" exitCode=2 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.568671 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerDied","Data":"2ddc356225a64a2567d2c939ebf5232096d7165e8a350bbdddf0c8812106b5e2"} Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.572741 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-k8fz9"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.649537 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lhskw"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.676636 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lhskw"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.692817 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ms95n"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.706374 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ms95n"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.720654 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-95g6h"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.736387 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-95g6h"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.794501 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" 
podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="ovsdbserver-sb" containerID="cri-o://ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" gracePeriod=300 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.825181 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tv452"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.857028 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tv452"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.894252 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fblpx"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.904179 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fblpx"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.922376 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f8rdg"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.957511 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"] Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.958515 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-jtrcj" podUID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" containerName="openstack-network-exporter" containerID="cri-o://e782f3020fb567d1d112542543ef24f19219d86c018400d9252caaab0cecce88" gracePeriod=30 Feb 23 07:09:58 crc kubenswrapper[5118]: I0223 07:09:58.992185 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.057041 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h47kr"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.084544 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"] Feb 23 
07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.084924 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="dnsmasq-dns" containerID="cri-o://193b69117be5af96f8bada0bd3b6b78639823387315ad23c2241b27eb7e2cc5a" gracePeriod=10 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.122197 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h47kr"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.179910 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-d7ws6"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.205019 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-d7ws6"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.213402 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.214016 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-log" containerID="cri-o://6b8c6eb9b9ce37fbe932ac5074db82c6dfa9e1f23cb6e0b090172cccdaf73dcf" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.217674 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-httpd" containerID="cri-o://4000f09676001090ea18cdb7df7926a1a769a2407d4b9008c3f8702217c3491e" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.242916 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.243211 5118 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/placement-6b5b85dd46-5prjs" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-log" containerID="cri-o://d9b130e05974e910c11275a4779f706f2b8c61a00e6da82692831f10958cd486" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.243622 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b5b85dd46-5prjs" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-api" containerID="cri-o://bf1c589b730495897b68ee4004af7ede183db68decb4860331926ec63e038b03" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.305115 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vlb9z"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.339711 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.339947 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ff569978f-gwmwn" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-api" containerID="cri-o://26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.343720 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6ff569978f-gwmwn" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-httpd" containerID="cri-o://8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.388139 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vlb9z"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.433304 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-7lndk"] Feb 23 07:09:59 crc 
kubenswrapper[5118]: I0223 07:09:59.516574 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-7lndk"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.573585 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dcn6z"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.600136 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-284f-account-create-update-75k6g"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.608784 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609411 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-server" containerID="cri-o://b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609687 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-updater" containerID="cri-o://5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609885 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-auditor" containerID="cri-o://9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609941 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-replicator" 
containerID="cri-o://8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609977 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-server" containerID="cri-o://5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610012 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-reaper" containerID="cri-o://8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610048 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-auditor" containerID="cri-o://3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610085 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-replicator" containerID="cri-o://1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610209 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="rsync" containerID="cri-o://6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.609899 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="swift-recon-cron" containerID="cri-o://e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610403 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-expirer" containerID="cri-o://a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610492 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-auditor" containerID="cri-o://e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610569 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-updater" containerID="cri-o://ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610576 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-replicator" containerID="cri-o://254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c" gracePeriod=30 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.610646 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-server" containerID="cri-o://0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64" gracePeriod=30 Feb 23 07:09:59 crc 
kubenswrapper[5118]: I0223 07:09:59.638970 5118 generic.go:334] "Generic (PLEG): container finished" podID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerID="befb16d8a3f73c9a8d367e972c7e2cebda984c914cdc82b0856f084d6aa276f7" exitCode=0 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.639039 5118 generic.go:334] "Generic (PLEG): container finished" podID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerID="56b4d180783d05f72c701c589b061dafc5eb3886c60c04d8b8ec016724107c8f" exitCode=0 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.639115 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerDied","Data":"befb16d8a3f73c9a8d367e972c7e2cebda984c914cdc82b0856f084d6aa276f7"} Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.639153 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerDied","Data":"56b4d180783d05f72c701c589b061dafc5eb3886c60c04d8b8ec016724107c8f"} Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.656204 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jtrcj_9568fac1-abdb-4b34-a0b7-e27d6c2183ee/openstack-network-exporter/0.log" Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.666378 5118 generic.go:334] "Generic (PLEG): container finished" podID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" containerID="e782f3020fb567d1d112542543ef24f19219d86c018400d9252caaab0cecce88" exitCode=2 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.666609 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jtrcj" event={"ID":"9568fac1-abdb-4b34-a0b7-e27d6c2183ee","Type":"ContainerDied","Data":"e782f3020fb567d1d112542543ef24f19219d86c018400d9252caaab0cecce88"} Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.692636 5118 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/placement-db-create-dcn6z"] Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.693812 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2474920d-9d8a-4fc8-a8bc-7844ed0ef139/ovsdbserver-sb/0.log" Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.693875 5118 generic.go:334] "Generic (PLEG): container finished" podID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerID="4bcc61517bff899445a02cd4853722bfdc2eee81bfb4ecb4fabf4e9cdeba87db" exitCode=2 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.694209 5118 generic.go:334] "Generic (PLEG): container finished" podID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerID="ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" exitCode=143 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.694278 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerDied","Data":"4bcc61517bff899445a02cd4853722bfdc2eee81bfb4ecb4fabf4e9cdeba87db"} Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.694337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerDied","Data":"ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d"} Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.719815 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" containerID="4ba34d66d196c48e8577f6109b59eff4f5b86c53ed431eae0c2091d19a42b92c" exitCode=137 Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.760471 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f00e7b-49f2-4bb1-a540-c7f9a13a76b7" path="/var/lib/kubelet/pods/07f00e7b-49f2-4bb1-a540-c7f9a13a76b7/volumes" Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.772845 5118 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="222c1eb0-e9da-4365-ad64-850496d1ceb7" path="/var/lib/kubelet/pods/222c1eb0-e9da-4365-ad64-850496d1ceb7/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.773718 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df7ebd2-0918-4164-beec-18057338255d" path="/var/lib/kubelet/pods/2df7ebd2-0918-4164-beec-18057338255d/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.774320 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3867ac0b-7e84-483e-917e-3f08a8ee2ae0" path="/var/lib/kubelet/pods/3867ac0b-7e84-483e-917e-3f08a8ee2ae0/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.780495 5118 generic.go:334] "Generic (PLEG): container finished" podID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerID="193b69117be5af96f8bada0bd3b6b78639823387315ad23c2241b27eb7e2cc5a" exitCode=0
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.790635 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3940758f-3420-45cc-8824-06a4daf1b598" path="/var/lib/kubelet/pods/3940758f-3420-45cc-8824-06a4daf1b598/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.791839 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4acffeac-bca3-441a-bf4c-81033e75dd62" path="/var/lib/kubelet/pods/4acffeac-bca3-441a-bf4c-81033e75dd62/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.792505 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70884006-8eb9-4cbb-ac48-a18531a8fe62" path="/var/lib/kubelet/pods/70884006-8eb9-4cbb-ac48-a18531a8fe62/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.803548 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81af6409-f8ce-485c-a2a1-1b1cce7c5433" path="/var/lib/kubelet/pods/81af6409-f8ce-485c-a2a1-1b1cce7c5433/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.804590 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892a426d-2034-416c-b5ce-9dbe665ff99e" path="/var/lib/kubelet/pods/892a426d-2034-416c-b5ce-9dbe665ff99e/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.805834 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc15918-ffec-4589-85ea-bdf3999dc8d6" path="/var/lib/kubelet/pods/8bc15918-ffec-4589-85ea-bdf3999dc8d6/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.810949 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92010864-4f7f-4066-9ce5-d14281346f97" path="/var/lib/kubelet/pods/92010864-4f7f-4066-9ce5-d14281346f97/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.811591 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d081a48a-959d-43fe-92de-180551979ba7" path="/var/lib/kubelet/pods/d081a48a-959d-43fe-92de-180551979ba7/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.812133 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ee323c-688a-4726-9c5d-75a21d189c67" path="/var/lib/kubelet/pods/f3ee323c-688a-4726-9c5d-75a21d189c67/volumes"
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.813396 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.816892 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.816934 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" event={"ID":"15482ef3-bf3a-4442-9b2e-89222e09d218","Type":"ContainerDied","Data":"193b69117be5af96f8bada0bd3b6b78639823387315ad23c2241b27eb7e2cc5a"}
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.817237 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-log" containerID="cri-o://7aa9aba83a6a4faa226f1ac01b3aa1d78be1db352ff3cdcb6601be775a19cd5c" gracePeriod=30
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.817663 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-httpd" containerID="cri-o://f79ac5b11fb0deba4274f00ce33e6c2a41832523dee197459bcb484c55984024" gracePeriod=30
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.824222 5118 generic.go:334] "Generic (PLEG): container finished" podID="2a79f618-3555-44a5-8c52-ec9120261645" containerID="6b8c6eb9b9ce37fbe932ac5074db82c6dfa9e1f23cb6e0b090172cccdaf73dcf" exitCode=143
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.824332 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerDied","Data":"6b8c6eb9b9ce37fbe932ac5074db82c6dfa9e1f23cb6e0b090172cccdaf73dcf"}
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.829624 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"]
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.876913 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.882015 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.919956 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2fp2g"]
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.932886 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 23 07:09:59 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: if [ -n "glance" ]; then
Feb 23 07:09:59 crc kubenswrapper[5118]: GRANT_DATABASE="glance"
Feb 23 07:09:59 crc kubenswrapper[5118]: else
Feb 23 07:09:59 crc kubenswrapper[5118]: GRANT_DATABASE="*"
Feb 23 07:09:59 crc kubenswrapper[5118]: fi
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: # going for maximum compatibility here:
Feb 23 07:09:59 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 23 07:09:59 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 23 07:09:59 crc kubenswrapper[5118]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 23 07:09:59 crc kubenswrapper[5118]: # support updates
Feb 23 07:09:59 crc kubenswrapper[5118]:
Feb 23 07:09:59 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError"
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.933346 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.933399 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd"
Feb 23 07:09:59 crc kubenswrapper[5118]: E0223 07:09:59.934916 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-284f-account-create-update-75k6g" podUID="2f6bea7d-670d-4ade-a7d6-0fea6a7e503d"
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.938278 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wgxlm"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.945296 5118 generic.go:334] "Generic (PLEG): container finished" podID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerID="d9b130e05974e910c11275a4779f706f2b8c61a00e6da82692831f10958cd486" exitCode=143
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.945421 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerDied","Data":"d9b130e05974e910c11275a4779f706f2b8c61a00e6da82692831f10958cd486"}
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.947550 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wgxlm"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.956681 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wvskp"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.967784 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2fp2g"]
Feb 23 07:09:59 crc kubenswrapper[5118]: I0223 07:09:59.976607 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:09:59.986446 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_015de465-4df3-4178-b28b-3dd5ec0f37aa/ovsdbserver-nb/0.log"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:09:59.986527 5118 generic.go:334] "Generic (PLEG): container finished" podID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerID="49b70411a72db2ea12793c02e06b682bbf363fae840fa8bc8f2d66bc890a6eef" exitCode=143
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:09:59.986590 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerDied","Data":"49b70411a72db2ea12793c02e06b682bbf363fae840fa8bc8f2d66bc890a6eef"}
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:09:59.997229 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wvskp"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.016305 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.026240 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.026503 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log" containerID="cri-o://d26689d86956826776f6a68df424f29414b578bdb30f6c4e05418d6010a876b3" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.026997 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata" containerID="cri-o://63d5f3f40e0bd27af62b5fc9b31aa60381f047575c6ba94cbf1284e1e6bbf343" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.035416 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.050757 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.051747 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-77bd586555-4s2g7" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-httpd" containerID="cri-o://d4786168035f880a50d9daf3c146a82f881d078e61fee117923247704b6bc372" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.052011 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-77bd586555-4s2g7" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-server" containerID="cri-o://66b033f5cdba1801f896cac33e18ec0b1d1f6261e8ea02c514f953da7c1c5a84" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.076200 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-n6s7x"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.089656 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.107570 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-n6s7x"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.125184 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.125820 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69db8f76f-xbrx7" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker-log" containerID="cri-o://c1cb45006da98332eb6cd8b2fbe6eb461c5eecf7073435dc50db5dc59fb5544b" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.126336 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-69db8f76f-xbrx7" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker" containerID="cri-o://6562661aead1d6afb939f1cc5250e5487e9146e38f4c8f43650a040bdd480d69" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.138430 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.148518 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.161810 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-log" containerID="cri-o://af045486508fe8cab1c8b590668cb9fcd431b4bc730c81a19eb7129715e8530d" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.161984 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-api" containerID="cri-o://5285ecb4115449aa0975018a2c7a4ad15449770fc6a16a5b4cc0704a6eea208b" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.212559 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.213165 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener-log" containerID="cri-o://838eea4ca37b326a64d5d678a35327acd726a105a4ac2e5f2979892d448df136" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.219079 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener" containerID="cri-o://4478c2eb2702eff226081dd62c590b24a317a608e7697bf1cd6f86ae6c3d71e1" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.233687 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd" containerID="cri-o://b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" gracePeriod=29
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.268398 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.268864 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerName="nova-cell1-conductor-conductor" containerID="cri-o://896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.288145 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.296024 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_015de465-4df3-4178-b28b-3dd5ec0f37aa/ovsdbserver-nb/0.log"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.296215 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.296553 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ks9t7"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.311768 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wps6h"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.347203 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.347667 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.360321 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.360378 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.360856 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqm47\" (UniqueName: \"kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.360887 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67t5b\" (UniqueName: \"kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b\") pod \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361130 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret\") pod \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361158 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361196 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361217 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle\") pod \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361259 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.361447 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") pod \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.365436 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.365489 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config\") pod \"015de465-4df3-4178-b28b-3dd5ec0f37aa\" (UID: \"015de465-4df3-4178-b28b-3dd5ec0f37aa\") "
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.373756 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.373857 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:04.373831015 +0000 UTC m=+1467.377615588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.395543 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts" (OuterVolumeSpecName: "scripts") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.395835 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.400415 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config" (OuterVolumeSpecName: "config") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.405827 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.454401 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.454679 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47" (OuterVolumeSpecName: "kube-api-access-bqm47") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "kube-api-access-bqm47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.487192 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.487245 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.487256 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/015de465-4df3-4178-b28b-3dd5ec0f37aa-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.487267 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqm47\" (UniqueName: \"kubernetes.io/projected/015de465-4df3-4178-b28b-3dd5ec0f37aa-kube-api-access-bqm47\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.487281 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.543991 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b" (OuterVolumeSpecName: "kube-api-access-67t5b") pod "1b3633e5-65f3-41c8-be57-5c4e28227ec9" (UID: "1b3633e5-65f3-41c8-be57-5c4e28227ec9"). InnerVolumeSpecName "kube-api-access-67t5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.580030 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="galera" containerID="cri-o://e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.608000 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.608324 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log" containerID="cri-o://27b321bfffbb4bd657b82546013c7861bf1c2f5af05ee3956e5a515a75954c59" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.609065 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api" containerID="cri-o://b78a6f1ffe539d4eb30bf74ad5a43f3802cf0350d1aba256152e434c6b28d94e" gracePeriod=30
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.610279 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d is running failed: container process not found" containerID="ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.628336 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67t5b\" (UniqueName: \"kubernetes.io/projected/1b3633e5-65f3-41c8-be57-5c4e28227ec9-kube-api-access-67t5b\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.633872 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d is running failed: container process not found" containerID="ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.635355 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d is running failed: container process not found" containerID="ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.635489 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="ovsdbserver-sb"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.752064 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-v6s5m"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.774446 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b3633e5-65f3-41c8-be57-5c4e28227ec9" (UID: "1b3633e5-65f3-41c8-be57-5c4e28227ec9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.793713 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wps6h"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.829501 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ks9t7"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.836784 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.843878 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-v6s5m"]
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.879273 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.923255 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"]
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.936434 5118 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 23 07:10:00 crc kubenswrapper[5118]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 23 07:10:00 crc kubenswrapper[5118]: + source /usr/local/bin/container-scripts/functions
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNBridge=br-int
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNRemote=tcp:localhost:6642
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNEncapType=geneve
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNAvailabilityZones=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ EnableChassisAsGateway=true
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ PhysicalNetworks=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNHostName=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ ovs_dir=/var/lib/openvswitch
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + cleanup_ovsdb_server_semaphore
Feb 23 07:10:00 crc kubenswrapper[5118]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 23 07:10:00 crc kubenswrapper[5118]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-dvc2v" message=<
Feb 23 07:10:00 crc kubenswrapper[5118]: Exiting ovsdb-server (5) [ OK ]
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 23 07:10:00 crc kubenswrapper[5118]: + source /usr/local/bin/container-scripts/functions
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNBridge=br-int
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNRemote=tcp:localhost:6642
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNEncapType=geneve
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNAvailabilityZones=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ EnableChassisAsGateway=true
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ PhysicalNetworks=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNHostName=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ ovs_dir=/var/lib/openvswitch
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + cleanup_ovsdb_server_semaphore
Feb 23 07:10:00 crc kubenswrapper[5118]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 23 07:10:00 crc kubenswrapper[5118]: >
Feb 23 07:10:00 crc kubenswrapper[5118]: E0223 07:10:00.936509 5118 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 23 07:10:00 crc kubenswrapper[5118]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 23 07:10:00 crc kubenswrapper[5118]: + source /usr/local/bin/container-scripts/functions
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNBridge=br-int
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNRemote=tcp:localhost:6642
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNEncapType=geneve
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNAvailabilityZones=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ EnableChassisAsGateway=true
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ PhysicalNetworks=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ OVNHostName=
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ ovs_dir=/var/lib/openvswitch
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 23 07:10:00 crc kubenswrapper[5118]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + sleep 0.5
Feb 23 07:10:00 crc kubenswrapper[5118]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 23 07:10:00 crc kubenswrapper[5118]: + cleanup_ovsdb_server_semaphore
Feb 23 07:10:00 crc kubenswrapper[5118]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 23 07:10:00 crc kubenswrapper[5118]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 23 07:10:00 crc kubenswrapper[5118]: > pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server" containerID="cri-o://b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5"
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.936606 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server" containerID="cri-o://b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" gracePeriod=29
Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.946323 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1b3633e5-65f3-41c8-be57-5c4e28227ec9" (UID: "1b3633e5-65f3-41c8-be57-5c4e28227ec9"). InnerVolumeSpecName "openstack-config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.946680 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") pod \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\" (UID: \"1b3633e5-65f3-41c8-be57-5c4e28227ec9\") " Feb 23 07:10:00 crc kubenswrapper[5118]: W0223 07:10:00.946915 5118 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1b3633e5-65f3-41c8-be57-5c4e28227ec9/volumes/kubernetes.io~configmap/openstack-config Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.946953 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1b3633e5-65f3-41c8-be57-5c4e28227ec9" (UID: "1b3633e5-65f3-41c8-be57-5c4e28227ec9"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.965736 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.965791 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.985253 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.987385 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:10:00 crc kubenswrapper[5118]: I0223 07:10:00.987686 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472" gracePeriod=30 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.014743 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tnnx"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.018106 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.019401 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4tnnx"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.024131 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.024346 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0751ec02f589e9689680bdbc3c817812f2676d3d07252002303baef1e65fe57b" gracePeriod=30 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.032624 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.032907 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerName="nova-scheduler-scheduler" containerID="cri-o://399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" gracePeriod=30 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.034519 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-284f-account-create-update-75k6g"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.044215 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.047207 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.057471 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs" 
(OuterVolumeSpecName: "metrics-certs-tls-certs") pod "015de465-4df3-4178-b28b-3dd5ec0f37aa" (UID: "015de465-4df3-4178-b28b-3dd5ec0f37aa"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.060509 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.061326 5118 generic.go:334] "Generic (PLEG): container finished" podID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerID="c1cb45006da98332eb6cd8b2fbe6eb461c5eecf7073435dc50db5dc59fb5544b" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.061427 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerDied","Data":"c1cb45006da98332eb6cd8b2fbe6eb461c5eecf7073435dc50db5dc59fb5544b"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.064393 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-skw6h"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.067464 5118 generic.go:334] "Generic (PLEG): container finished" podID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerID="27b321bfffbb4bd657b82546013c7861bf1c2f5af05ee3956e5a515a75954c59" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.067526 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerDied","Data":"27b321bfffbb4bd657b82546013c7861bf1c2f5af05ee3956e5a515a75954c59"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.069545 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"2ff79df5-722f-4ea5-91a2-8368d8eeee99","Type":"ContainerDied","Data":"e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.069570 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d523e4b486df6459175b76b433d0f80a8af70c1a832712e0f2bf145831fe81" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.070534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-284f-account-create-update-75k6g" event={"ID":"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d","Type":"ContainerStarted","Data":"eff661844897b462a2ba9b194595054fb6162e57dd1a1ba95dab0640d5674a53"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.073345 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.073378 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.073391 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/015de465-4df3-4178-b28b-3dd5ec0f37aa-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.074413 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1b3633e5-65f3-41c8-be57-5c4e28227ec9" (UID: "1b3633e5-65f3-41c8-be57-5c4e28227ec9"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.074555 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:55974->10.217.0.163:8776: read: connection reset by peer" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.094768 5118 generic.go:334] "Generic (PLEG): container finished" podID="68f11050-5931-4be3-8e5b-194035e88020" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.094895 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerDied","Data":"b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.138658 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jtrcj_9568fac1-abdb-4b34-a0b7-e27d6c2183ee/openstack-network-exporter/0.log" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.139403 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jtrcj" event={"ID":"9568fac1-abdb-4b34-a0b7-e27d6c2183ee","Type":"ContainerDied","Data":"7d77ea7b34a758bfcf2117aa91c7d2693cd803d6697653b510f7487074e9e561"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.140259 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d77ea7b34a758bfcf2117aa91c7d2693cd803d6697653b510f7487074e9e561" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.142851 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" 
containerID="af045486508fe8cab1c8b590668cb9fcd431b4bc730c81a19eb7129715e8530d" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.142940 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerDied","Data":"af045486508fe8cab1c8b590668cb9fcd431b4bc730c81a19eb7129715e8530d"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.156759 5118 generic.go:334] "Generic (PLEG): container finished" podID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerID="d26689d86956826776f6a68df424f29414b578bdb30f6c4e05418d6010a876b3" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.156913 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerDied","Data":"d26689d86956826776f6a68df424f29414b578bdb30f6c4e05418d6010a876b3"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.159683 5118 scope.go:117] "RemoveContainer" containerID="4ba34d66d196c48e8577f6109b59eff4f5b86c53ed431eae0c2091d19a42b92c" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.159873 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.179577 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b3633e5-65f3-41c8-be57-5c4e28227ec9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.180481 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.180717 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data podName:e3b37356-5c38-40b3-af55-4f25a2f16b21 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:01.680692229 +0000 UTC m=+1464.684476802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data") pod "rabbitmq-server-0" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21") : configmap "rabbitmq-config-data" not found Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.187057 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_015de465-4df3-4178-b28b-3dd5ec0f37aa/ovsdbserver-nb/0.log" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.187388 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"015de465-4df3-4178-b28b-3dd5ec0f37aa","Type":"ContainerDied","Data":"aebfe7f2b2170606fd6785625c51c21b5631b8b51e702b5e19689be0347c590b"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.187517 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.259348 5118 generic.go:334] "Generic (PLEG): container finished" podID="cf1c2052-6563-45c5-888c-f7a153225f83" containerID="8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.259505 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerDied","Data":"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.280523 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerStarted","Data":"2c591f552ee5660b09ac0ca89d5f086d3a3684d444411ab1b547f0a9ceef6efb"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.290610 5118 generic.go:334] "Generic (PLEG): container finished" podID="08b38d85-cf57-41a9-9779-1593300b77a3" containerID="838eea4ca37b326a64d5d678a35327acd726a105a4ac2e5f2979892d448df136" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.290688 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerDied","Data":"838eea4ca37b326a64d5d678a35327acd726a105a4ac2e5f2979892d448df136"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.292718 5118 generic.go:334] "Generic (PLEG): container finished" podID="376ff246-417d-442a-83d1-1579abd318ba" containerID="7aa9aba83a6a4faa226f1ac01b3aa1d78be1db352ff3cdcb6601be775a19cd5c" exitCode=143 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.292767 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerDied","Data":"7aa9aba83a6a4faa226f1ac01b3aa1d78be1db352ff3cdcb6601be775a19cd5c"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.320459 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-skw6h" event={"ID":"5e254320-082c-442b-a1a9-4b7fafe2c556","Type":"ContainerStarted","Data":"5f192cafde438fd57c36a523384e8d786abec6794e339c9dc4d5871e4051c5cb"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.368335 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2474920d-9d8a-4fc8-a8bc-7844ed0ef139/ovsdbserver-sb/0.log" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.368439 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.392783 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.395411 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.407035 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422451 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422493 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422501 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422511 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422519 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422529 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422536 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422545 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422555 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422562 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422569 5118 
generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422576 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422583 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422590 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422643 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422679 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422689 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422710 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422719 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422728 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422738 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422763 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422777 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed"} Feb 23 07:10:01 crc 
kubenswrapper[5118]: I0223 07:10:01.422786 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422797 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422806 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.422817 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.426520 5118 scope.go:117] "RemoveContainer" containerID="2ddc356225a64a2567d2c939ebf5232096d7165e8a350bbdddf0c8812106b5e2" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.433610 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.442026 5118 generic.go:334] "Generic (PLEG): container finished" podID="e75b838e-decf-4583-8b96-a41f54e2a654" containerID="66b033f5cdba1801f896cac33e18ec0b1d1f6261e8ea02c514f953da7c1c5a84" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.442058 5118 generic.go:334] "Generic (PLEG): container finished" podID="e75b838e-decf-4583-8b96-a41f54e2a654" containerID="d4786168035f880a50d9daf3c146a82f881d078e61fee117923247704b6bc372" exitCode=0 Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.442137 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerDied","Data":"66b033f5cdba1801f896cac33e18ec0b1d1f6261e8ea02c514f953da7c1c5a84"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.442170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerDied","Data":"d4786168035f880a50d9daf3c146a82f881d078e61fee117923247704b6bc372"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.460859 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jtrcj_9568fac1-abdb-4b34-a0b7-e27d6c2183ee/openstack-network-exporter/0.log" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.460961 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jtrcj" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.487528 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2474920d-9d8a-4fc8-a8bc-7844ed0ef139/ovsdbserver-sb/0.log" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.487638 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2474920d-9d8a-4fc8-a8bc-7844ed0ef139","Type":"ContainerDied","Data":"beaf5e0519e34249b5662fadd22cd742ef8a03b6ba0e170d7b95319af9915030"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.487754 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.488853 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7gnk\" (UniqueName: \"kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.488886 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.488910 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zckmn\" (UniqueName: \"kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.488963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489073 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489129 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489199 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489257 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489311 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc 
kubenswrapper[5118]: I0223 07:10:01.489372 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489438 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489500 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489530 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dcvv\" (UniqueName: \"kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489552 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489587 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489628 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb\") pod \"15482ef3-bf3a-4442-9b2e-89222e09d218\" (UID: \"15482ef3-bf3a-4442-9b2e-89222e09d218\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489720 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.489748 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts\") pod \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\" (UID: \"2474920d-9d8a-4fc8-a8bc-7844ed0ef139\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.490079 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts\") pod \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\" (UID: \"2ff79df5-722f-4ea5-91a2-8368d8eeee99\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.490785 5118 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.492746 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.497690 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config" (OuterVolumeSpecName: "config") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.501411 5118 scope.go:117] "RemoveContainer" containerID="49b70411a72db2ea12793c02e06b682bbf363fae840fa8bc8f2d66bc890a6eef" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.504772 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts" (OuterVolumeSpecName: "scripts") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.509616 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" event={"ID":"15482ef3-bf3a-4442-9b2e-89222e09d218","Type":"ContainerDied","Data":"ef247005c027dd1fcb615daca58067a68edf7c681dbfb3756e7e42eb7e3e5240"} Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.509759 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.511176 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts" (OuterVolumeSpecName: "scripts") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.512365 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn" (OuterVolumeSpecName: "kube-api-access-zckmn") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "kube-api-access-zckmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.516731 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv" (OuterVolumeSpecName: "kube-api-access-9dcvv") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "kube-api-access-9dcvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.522492 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk" (OuterVolumeSpecName: "kube-api-access-x7gnk") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "kube-api-access-x7gnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.540745 5118 scope.go:117] "RemoveContainer" containerID="4bcc61517bff899445a02cd4853722bfdc2eee81bfb4ecb4fabf4e9cdeba87db" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.544393 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.553708 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.558693 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.591957 5118 scope.go:117] "RemoveContainer" containerID="ea83eec64cea8c5567dc4865cb628c297f434268faaa7b0db99d30005f0be35d" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.592819 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.592864 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.592991 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.593059 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.593218 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.593280 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.593470 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.593564 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxghk\" (UniqueName: \"kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk\") pod \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\" (UID: \"9568fac1-abdb-4b34-a0b7-e27d6c2183ee\") " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.594943 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595027 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595089 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7gnk\" (UniqueName: \"kubernetes.io/projected/2ff79df5-722f-4ea5-91a2-8368d8eeee99-kube-api-access-x7gnk\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595163 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zckmn\" (UniqueName: \"kubernetes.io/projected/15482ef3-bf3a-4442-9b2e-89222e09d218-kube-api-access-zckmn\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595230 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595289 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-config\") on node \"crc\" 
DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595344 5118 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595404 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595482 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595541 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595604 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff79df5-722f-4ea5-91a2-8368d8eeee99-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.595665 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dcvv\" (UniqueName: \"kubernetes.io/projected/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-kube-api-access-9dcvv\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.630060 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config" (OuterVolumeSpecName: "config") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.631431 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk" (OuterVolumeSpecName: "kube-api-access-xxghk") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). InnerVolumeSpecName "kube-api-access-xxghk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.645008 5118 scope.go:117] "RemoveContainer" containerID="193b69117be5af96f8bada0bd3b6b78639823387315ad23c2241b27eb7e2cc5a" Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.674559 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.674970 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.675268 5118 scope.go:117] "RemoveContainer" containerID="136f6da0caaf2340fd15c1a32301bdddbb712bf41669c14f592aaf8d43d192b1" Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.675366 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.679595 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.679729 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server" Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.681420 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.682449 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:01 crc 
kubenswrapper[5118]: E0223 07:10:01.682488 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.698372 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.698415 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxghk\" (UniqueName: \"kubernetes.io/projected/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-kube-api-access-xxghk\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.698554 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:10:01 crc kubenswrapper[5118]: E0223 07:10:01.698618 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data podName:e3b37356-5c38-40b3-af55-4f25a2f16b21 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:02.698596363 +0000 UTC m=+1465.702380936 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data") pod "rabbitmq-server-0" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21") : configmap "rabbitmq-config-data" not found Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.710918 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" path="/var/lib/kubelet/pods/015de465-4df3-4178-b28b-3dd5ec0f37aa/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.712588 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3633e5-65f3-41c8-be57-5c4e28227ec9" path="/var/lib/kubelet/pods/1b3633e5-65f3-41c8-be57-5c4e28227ec9/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.713513 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="262a3252-99e5-4a2e-8164-c29c6e9b7764" path="/var/lib/kubelet/pods/262a3252-99e5-4a2e-8164-c29c6e9b7764/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.714740 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0f3a81-94a7-4553-abc5-d58e0150aeea" path="/var/lib/kubelet/pods/2e0f3a81-94a7-4553-abc5-d58e0150aeea/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.715418 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488" path="/var/lib/kubelet/pods/6b574a0c-8f8a-4db9-b5bd-cfe8ea5fd488/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.716614 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bdcb4a-de40-40a6-bba6-8010544c618f" path="/var/lib/kubelet/pods/71bdcb4a-de40-40a6-bba6-8010544c618f/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.719667 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e0be85-acce-4ee5-a56a-22f60082695d" 
path="/var/lib/kubelet/pods/74e0be85-acce-4ee5-a56a-22f60082695d/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.720392 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814f160f-4a46-433d-8abb-bdf4b6ce65d3" path="/var/lib/kubelet/pods/814f160f-4a46-433d-8abb-bdf4b6ce65d3/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.720932 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84786dbb-5d55-4ca5-bceb-b66bd83d6c06" path="/var/lib/kubelet/pods/84786dbb-5d55-4ca5-bceb-b66bd83d6c06/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.721471 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff3ee59-87cc-452f-a176-37b6f8d4307b" path="/var/lib/kubelet/pods/8ff3ee59-87cc-452f-a176-37b6f8d4307b/volumes" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.844758 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.905222 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:01 crc kubenswrapper[5118]: I0223 07:10:01.908509 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.008530 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.052775 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.077237 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.119400 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.138292 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.163916 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.175224 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.179349 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.196235 5118 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.211741 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.232629 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.246945 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.254599 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.359913 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.194:6080/vnc_lite.html\": dial tcp 10.217.0.194:6080: connect: connection refused" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.404722 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.471509 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9568fac1-abdb-4b34-a0b7-e27d6c2183ee" (UID: "9568fac1-abdb-4b34-a0b7-e27d6c2183ee"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.480556 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config" (OuterVolumeSpecName: "config") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.482064 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.482912 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.482935 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9568fac1-abdb-4b34-a0b7-e27d6c2183ee-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.482946 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.482955 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-ovsdbserver-sb\") 
on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.535758 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.542672 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.543006 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerStarted","Data":"4f65c7492b670e7aacb6eaa4d4d0afc013484e1f73eb58a4335ca6edeb4ee964"} Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.550179 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 
crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "cinder" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="cinder" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.551869 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-6db8-account-create-update-2ckqc" podUID="2c9e11a2-4b8f-4578-8c92-7d3e06258800" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.553068 5118 generic.go:334] "Generic (PLEG): container finished" podID="178ef478-d8d3-49a5-9188-9970d3859049" containerID="e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818" exitCode=0 Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.553188 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerDied","Data":"e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 
07:10:02.567424 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerID="ebff57dc3960e140d5d736483b49e68e1f66fb01b96162a5b943eef862f02286" exitCode=1 Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.567505 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-skw6h" event={"ID":"5e254320-082c-442b-a1a9-4b7fafe2c556","Type":"ContainerDied","Data":"ebff57dc3960e140d5d736483b49e68e1f66fb01b96162a5b943eef862f02286"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.568232 5118 scope.go:117] "RemoveContainer" containerID="ebff57dc3960e140d5d736483b49e68e1f66fb01b96162a5b943eef862f02286" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.585680 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "nova_api" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="nova_api" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc 
kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.592410 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "nova_cell0" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="nova_cell0" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.592499 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-2de9-account-create-update-7hqls" podUID="652f586b-2ae4-4a45-bc82-01c65ec27696" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.594250 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-9656-account-create-update-dshm9" podUID="8a26f9ed-63ef-4fa5-934f-e1190d79cf85" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.595045 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-284f-account-create-update-75k6g" event={"ID":"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d","Type":"ContainerDied","Data":"eff661844897b462a2ba9b194595054fb6162e57dd1a1ba95dab0640d5674a53"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.595077 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff661844897b462a2ba9b194595054fb6162e57dd1a1ba95dab0640d5674a53" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.598913 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.598937 5118 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.610319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "15482ef3-bf3a-4442-9b2e-89222e09d218" (UID: "15482ef3-bf3a-4442-9b2e-89222e09d218"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.624336 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "neutron" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="neutron" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.625978 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-f94b-account-create-update-cvr4d" podUID="bf376c5f-cc04-4733-9968-e199472b4241" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.641377 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2de9-account-create-update-7hqls" event={"ID":"652f586b-2ae4-4a45-bc82-01c65ec27696","Type":"ContainerStarted","Data":"c58327e6896b511018d79999f3fdccd646c957fe6270c09cf1fba2e639f96da6"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.660018 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerStarted","Data":"76405e757f3b71bdf4e65919529ef87fb707a2c251ee9f626bb7d91a76d742ca"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.686621 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2474920d-9d8a-4fc8-a8bc-7844ed0ef139" (UID: "2474920d-9d8a-4fc8-a8bc-7844ed0ef139"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.701219 5118 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/15482ef3-bf3a-4442-9b2e-89222e09d218-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.701265 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2474920d-9d8a-4fc8-a8bc-7844ed0ef139-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.701936 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.702057 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data podName:e3b37356-5c38-40b3-af55-4f25a2f16b21 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:04.702031328 +0000 UTC m=+1467.705815901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data") pod "rabbitmq-server-0" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21") : configmap "rabbitmq-config-data" not found Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.708289 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.710508 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77bd586555-4s2g7" event={"ID":"e75b838e-decf-4583-8b96-a41f54e2a654","Type":"ContainerDied","Data":"8ae6db35f5b3512c867b9317844df3160b3a086707b468e0c8be638a0ef9f66a"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.710566 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ae6db35f5b3512c867b9317844df3160b3a086707b468e0c8be638a0ef9f66a" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.715244 5118 generic.go:334] "Generic (PLEG): container finished" podID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerID="0751ec02f589e9689680bdbc3c817812f2676d3d07252002303baef1e65fe57b" exitCode=0 Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.715394 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"04ba04e0-7d62-472f-ab31-c41f926c93e7","Type":"ContainerDied","Data":"0751ec02f589e9689680bdbc3c817812f2676d3d07252002303baef1e65fe57b"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.719257 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9656-account-create-update-dshm9" event={"ID":"8a26f9ed-63ef-4fa5-934f-e1190d79cf85","Type":"ContainerStarted","Data":"0769b7bc6afe339adb9ddf80dce8375665e48a93dfa9d06b8cff2989b0702d93"} Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.738244 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source 
/var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "barbican" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="barbican" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.741737 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "nova_cell1" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="nova_cell1" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.743326 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-241c-account-create-update-2sgxt" podUID="0079e5d0-c2f8-43f7-8dad-207eaedca4d6" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.744551 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-a094-account-create-update-5m7nf" podUID="9b555633-a53e-4689-b746-98bd29e6742e" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.755234 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data" (OuterVolumeSpecName: "config-data") pod "2ff79df5-722f-4ea5-91a2-8368d8eeee99" (UID: "2ff79df5-722f-4ea5-91a2-8368d8eeee99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.759874 5118 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 23 07:10:02 crc kubenswrapper[5118]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: if [ -n "placement" ]; then Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="placement" Feb 23 07:10:02 crc kubenswrapper[5118]: else Feb 23 07:10:02 crc kubenswrapper[5118]: GRANT_DATABASE="*" Feb 23 07:10:02 crc kubenswrapper[5118]: fi Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: # going for maximum compatibility here: Feb 23 07:10:02 crc kubenswrapper[5118]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 23 07:10:02 crc kubenswrapper[5118]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 23 07:10:02 crc kubenswrapper[5118]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 23 07:10:02 crc kubenswrapper[5118]: # support updates Feb 23 07:10:02 crc kubenswrapper[5118]: Feb 23 07:10:02 crc kubenswrapper[5118]: $MYSQL_CMD < logger="UnhandledError" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.761973 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-2e07-account-create-update-9hjkr" podUID="745f142b-ee2d-4354-98b3-2b6cd13e3b5e" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.804896 5118 generic.go:334] "Generic (PLEG): container finished" podID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerID="50ca6bcdb97f2c6020994172d7c32e3ae68a54330a49a74eecef1ee656c8e4db" exitCode=0 Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.805039 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerDied","Data":"50ca6bcdb97f2c6020994172d7c32e3ae68a54330a49a74eecef1ee656c8e4db"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.805081 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f424d603-7efb-4075-9a9b-5117dec09a6a","Type":"ContainerDied","Data":"838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.805113 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838c10107267d6ca2357c09d772e877661a99d19bf12df6a96b83b2bece06c6f" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.805749 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 
07:10:02.805802 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff79df5-722f-4ea5-91a2-8368d8eeee99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.884066 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70 is running failed: container process not found" containerID="896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.910836 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70 is running failed: container process not found" containerID="896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.910877 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerStarted","Data":"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297"} Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 07:10:02.919059 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70 is running failed: container process not found" containerID="896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 23 07:10:02 crc kubenswrapper[5118]: E0223 
07:10:02.919185 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerName="nova-cell1-conductor-conductor" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.946512 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jtrcj" Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.946662 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6db8-account-create-update-2ckqc" event={"ID":"2c9e11a2-4b8f-4578-8c92-7d3e06258800","Type":"ContainerStarted","Data":"7a8c9837986fa48aab8305e888ed5c6d683ef9586a5ec306bfcfa326a7564f1e"} Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.947047 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.982128 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:10:02 crc kubenswrapper[5118]: I0223 07:10:02.982516 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.057069 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="rabbitmq" containerID="cri-o://0a5be7ec0d548d228afd1de2a7afc107749b13909039d656bc1026e5bcf306a3" gracePeriod=604800
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.204113 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6ff569978f-gwmwn" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9696/\": dial tcp 10.217.0.164:9696: connect: connection refused"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.269314 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.382603 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.383045 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-central-agent" containerID="cri-o://ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8" gracePeriod=30
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.383617 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="proxy-httpd" containerID="cri-o://fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274" gracePeriod=30
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.383688 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="sg-core" containerID="cri-o://807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18" gracePeriod=30
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.383743 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-notification-agent" containerID="cri-o://5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7" gracePeriod=30
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.430580 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.430934 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" containerName="kube-state-metrics" containerID="cri-o://f9fbd71a03f2a93ba309cdce906910ae6003771c797503a1153318d351d6f24a" gracePeriod=30
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.433708 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.433787 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.433806 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.433972 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.433998 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.434025 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j6zj\" (UniqueName: \"kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.434110 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.434205 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.434235 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id\") pod \"f424d603-7efb-4075-9a9b-5117dec09a6a\" (UID: \"f424d603-7efb-4075-9a9b-5117dec09a6a\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.434928 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.441270 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs" (OuterVolumeSpecName: "logs") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.465894 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj" (OuterVolumeSpecName: "kube-api-access-5j6zj") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "kube-api-access-5j6zj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.485801 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts" (OuterVolumeSpecName: "scripts") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.486075 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.499600 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.527992 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-75k6g"
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.549481 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.552920 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.552958 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f424d603-7efb-4075-9a9b-5117dec09a6a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.552967 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f424d603-7efb-4075-9a9b-5117dec09a6a-logs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.552975 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.552987 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j6zj\" (UniqueName: \"kubernetes.io/projected/f424d603-7efb-4075-9a9b-5117dec09a6a-kube-api-access-5j6zj\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.573973 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.574036 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerName="nova-cell0-conductor-conductor"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.583001 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.600023 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.644332 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"]
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.657687 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2gvm\" (UniqueName: \"kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm\") pod \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.658024 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts\") pod \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\" (UID: \"2f6bea7d-670d-4ade-a7d6-0fea6a7e503d\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.659649 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f6bea7d-670d-4ade-a7d6-0fea6a7e503d" (UID: "2f6bea7d-670d-4ade-a7d6-0fea6a7e503d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.722467 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77bd586555-4s2g7"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.745996 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" path="/var/lib/kubelet/pods/2474920d-9d8a-4fc8-a8bc-7844ed0ef139/volumes"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.765072 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.823134 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.856150 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm" (OuterVolumeSpecName: "kube-api-access-g2gvm") pod "2f6bea7d-670d-4ade-a7d6-0fea6a7e503d" (UID: "2f6bea7d-670d-4ade-a7d6-0fea6a7e503d"). InnerVolumeSpecName "kube-api-access-g2gvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868129 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868205 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868261 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868317 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868382 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868540 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868577 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.868633 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg2xq\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq\") pod \"e75b838e-decf-4583-8b96-a41f54e2a654\" (UID: \"e75b838e-decf-4583-8b96-a41f54e2a654\") "
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.869221 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.869245 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2gvm\" (UniqueName: \"kubernetes.io/projected/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d-kube-api-access-g2gvm\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.879793 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.898476 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.927710 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": dial tcp 10.217.0.201:8775: connect: connection refused"
Feb 23 07:10:03 crc kubenswrapper[5118]: I0223 07:10:03.928171 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": dial tcp 10.217.0.201:8775: connect: connection refused"
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.938062 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d is running failed: container process not found" containerID="399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.941162 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d is running failed: container process not found" containerID="399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.942951 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d is running failed: container process not found" containerID="399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 07:10:03 crc kubenswrapper[5118]: E0223 07:10:03.942984 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerName="nova-scheduler-scheduler"
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.011893 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.011942 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e75b838e-decf-4583-8b96-a41f54e2a654-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.015521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq" (OuterVolumeSpecName: "kube-api-access-gg2xq") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "kube-api-access-gg2xq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.040937 5118 generic.go:334] "Generic (PLEG): container finished" podID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerID="fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.040974 5118 generic.go:334] "Generic (PLEG): container finished" podID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerID="807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18" exitCode=2
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.046675 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.089469 5118 generic.go:334] "Generic (PLEG): container finished" podID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerID="bf1c589b730495897b68ee4004af7ede183db68decb4860331926ec63e038b03" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.124633 5118 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.124686 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg2xq\" (UniqueName: \"kubernetes.io/projected/e75b838e-decf-4583-8b96-a41f54e2a654-kube-api-access-gg2xq\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.126743 5118 generic.go:334] "Generic (PLEG): container finished" podID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerID="63d5f3f40e0bd27af62b5fc9b31aa60381f047575c6ba94cbf1284e1e6bbf343" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.217062 5118 generic.go:334] "Generic (PLEG): container finished" podID="2a79f618-3555-44a5-8c52-ec9120261645" containerID="4000f09676001090ea18cdb7df7926a1a769a2407d4b9008c3f8702217c3491e" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.299948 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerID="5285ecb4115449aa0975018a2c7a4ad15449770fc6a16a5b4cc0704a6eea208b" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.320786 5118 generic.go:334] "Generic (PLEG): container finished" podID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerID="896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.419552 5118 generic.go:334] "Generic (PLEG): container finished" podID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" containerID="f9fbd71a03f2a93ba309cdce906910ae6003771c797503a1153318d351d6f24a" exitCode=2
Feb 23 07:10:04 crc kubenswrapper[5118]: E0223 07:10:04.440334 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:04 crc kubenswrapper[5118]: E0223 07:10:04.440409 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:12.440392211 +0000 UTC m=+1475.444176784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.472767 5118 generic.go:334] "Generic (PLEG): container finished" podID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerID="399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.479021 5118 generic.go:334] "Generic (PLEG): container finished" podID="376ff246-417d-442a-83d1-1579abd318ba" containerID="f79ac5b11fb0deba4274f00ce33e6c2a41832523dee197459bcb484c55984024" exitCode=0
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.488041 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-284f-account-create-update-75k6g"
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.488479 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.488580 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77bd586555-4s2g7"
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.684478 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:53588->10.217.0.159:9311: read: connection reset by peer"
Feb 23 07:10:04 crc kubenswrapper[5118]: I0223 07:10:04.684541 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:53586->10.217.0.159:9311: read: connection reset by peer"
Feb 23 07:10:04 crc kubenswrapper[5118]: E0223 07:10:04.749065 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 23 07:10:04 crc kubenswrapper[5118]: E0223 07:10:04.749142 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data podName:e3b37356-5c38-40b3-af55-4f25a2f16b21 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:08.749127212 +0000 UTC m=+1471.752911785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data") pod "rabbitmq-server-0" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21") : configmap "rabbitmq-config-data" not found
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.061207 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.161419 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.173970 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.185835 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data" (OuterVolumeSpecName: "config-data") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.217657 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data" (OuterVolumeSpecName: "config-data") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.276949 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.277014 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.277025 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.399319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f424d603-7efb-4075-9a9b-5117dec09a6a" (UID: "f424d603-7efb-4075-9a9b-5117dec09a6a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.419156 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818 is running failed: container process not found" containerID="e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.421331 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818 is running failed: container process not found" containerID="e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.421737 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818 is running failed: container process not found" containerID="e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.421773 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="galera"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.486682 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f424d603-7efb-4075-9a9b-5117dec09a6a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.489038 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.505496 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58f6456c9f-tlrdl" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: i/o timeout"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.528258 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e75b838e-decf-4583-8b96-a41f54e2a654" (UID: "e75b838e-decf-4583-8b96-a41f54e2a654"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.544466 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener-log" containerID="cri-o://3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297" gracePeriod=30
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.545014 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener" containerID="cri-o://ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd" gracePeriod=30
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.567196 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" podStartSLOduration=10.567179508 podStartE2EDuration="10.567179508s" podCreationTimestamp="2026-02-23 07:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:05.56482809 +0000 UTC m=+1468.568612663" watchObservedRunningTime="2026-02-23 07:10:05.567179508 +0000 UTC m=+1468.570964081"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.567908 5118 generic.go:334] "Generic (PLEG): container finished" podID="08b38d85-cf57-41a9-9779-1593300b77a3" containerID="4478c2eb2702eff226081dd62c590b24a317a608e7697bf1cd6f86ae6c3d71e1" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.585404 5118 generic.go:334] "Generic (PLEG): container finished" podID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerID="ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.588596 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.588618 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e75b838e-decf-4583-8b96-a41f54e2a654-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.590554 5118 generic.go:334] "Generic (PLEG): container finished" podID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerID="7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.604054 5118 generic.go:334] "Generic (PLEG): container finished" podID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerID="6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.676456 5118 generic.go:334] "Generic (PLEG): container finished" podID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerID="b78a6f1ffe539d4eb30bf74ad5a43f3802cf0350d1aba256152e434c6b28d94e" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.678683 5118 generic.go:334] "Generic (PLEG): container finished" podID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerID="6562661aead1d6afb939f1cc5250e5487e9146e38f4c8f43650a040bdd480d69" exitCode=0
Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.952928 5118 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.256s"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973737 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f6456c9f-tlrdl"]
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973782 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/nova-cell1-241c-account-create-update-2sgxt" event={"ID":"0079e5d0-c2f8-43f7-8dad-207eaedca4d6","Type":"ContainerStarted","Data":"e1c3c27943fd922be8c877369c5e570f16319e733d265689f4387c2a51164658"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973803 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973818 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1011-account-create-update-vx6cd"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973831 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1011-account-create-update-vx6cd"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973841 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerDied","Data":"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.973855 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1011-account-create-update-86wgz"] Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974351 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974380 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974403 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="init" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974412 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="init" Feb 23 07:10:05 crc 
kubenswrapper[5118]: E0223 07:10:05.974427 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="probe" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974434 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="probe" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974463 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="ovsdbserver-nb" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974472 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="ovsdbserver-nb" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974486 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-server" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974499 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-server" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974515 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="dnsmasq-dns" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974522 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="dnsmasq-dns" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974535 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="ovsdbserver-sb" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974545 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="ovsdbserver-sb" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974560 5118 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="cinder-scheduler" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974572 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="cinder-scheduler" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974587 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974595 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974609 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api-log" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974621 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api-log" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974641 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974649 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 07:10:05.974663 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-httpd" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974675 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-httpd" Feb 23 07:10:05 crc kubenswrapper[5118]: E0223 
07:10:05.974689 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.974696 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975085 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="ovsdbserver-nb" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975117 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975127 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="cinder-scheduler" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975143 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-server" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975163 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-httpd" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975174 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2474920d-9d8a-4fc8-a8bc-7844ed0ef139" containerName="ovsdbserver-sb" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975184 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975199 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api-log" Feb 23 07:10:05 crc 
kubenswrapper[5118]: I0223 07:10:05.975211 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" containerName="dnsmasq-dns" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975225 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" containerName="probe" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975236 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.975250 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="015de465-4df3-4178-b28b-3dd5ec0f37aa" containerName="openstack-network-exporter" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.978260 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" containerName="memcached" containerID="cri-o://275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97" gracePeriod=30 Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990469 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerDied","Data":"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990772 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerDied","Data":"bf1c589b730495897b68ee4004af7ede183db68decb4860331926ec63e038b03"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990791 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerDied","Data":"63d5f3f40e0bd27af62b5fc9b31aa60381f047575c6ba94cbf1284e1e6bbf343"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990806 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerDied","Data":"4000f09676001090ea18cdb7df7926a1a769a2407d4b9008c3f8702217c3491e"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990820 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerDied","Data":"5285ecb4115449aa0975018a2c7a4ad15449770fc6a16a5b4cc0704a6eea208b"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990836 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"460a8b7a-b61f-4f56-889e-54b5c2346679","Type":"ContainerDied","Data":"896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990851 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"178ef478-d8d3-49a5-9188-9970d3859049","Type":"ContainerDied","Data":"11feabea80b251ccced75ac02e3dc37398e6a4b3a896a11729e74ab950ac1672"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990863 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11feabea80b251ccced75ac02e3dc37398e6a4b3a896a11729e74ab950ac1672" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990876 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1011-account-create-update-86wgz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990891 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f94b-account-create-update-cvr4d" 
event={"ID":"bf376c5f-cc04-4733-9968-e199472b4241","Type":"ContainerStarted","Data":"f990503873217772bfd06775b0547cdc1a28b885ca6f13e6a8f42b0cfa842983"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990901 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bk5mz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990914 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerStarted","Data":"2c6e64facf64b99522efa81e478fdc6768fd5a33b27ca954b45ee12797cc600c"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990928 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed","Type":"ContainerDied","Data":"f9fbd71a03f2a93ba309cdce906910ae6003771c797503a1153318d351d6f24a"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990941 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-45wnt"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990952 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"04ba04e0-7d62-472f-ab31-c41f926c93e7","Type":"ContainerDied","Data":"d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990963 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97e95f2b2442bb38db35cd0f871789560e7f0e34611ea6eca024c22fe1d3893" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990973 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bk5mz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990986 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-45wnt"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990997 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9656-account-create-update-dshm9" event={"ID":"8a26f9ed-63ef-4fa5-934f-e1190d79cf85","Type":"ContainerDied","Data":"0769b7bc6afe339adb9ddf80dce8375665e48a93dfa9d06b8cff2989b0702d93"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991008 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0769b7bc6afe339adb9ddf80dce8375665e48a93dfa9d06b8cff2989b0702d93" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991018 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf1e2e3d-0fc3-474a-a15d-6808347c8240","Type":"ContainerDied","Data":"399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991029 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991041 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-jtrcj"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991052 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2de9-account-create-update-7hqls" event={"ID":"652f586b-2ae4-4a45-bc82-01c65ec27696","Type":"ContainerDied","Data":"c58327e6896b511018d79999f3fdccd646c957fe6270c09cf1fba2e639f96da6"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991063 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c58327e6896b511018d79999f3fdccd646c957fe6270c09cf1fba2e639f96da6" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991072 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991088 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-777dc4b79-zkfms"] Feb 23 07:10:05 crc 
kubenswrapper[5118]: I0223 07:10:05.991128 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerDied","Data":"f79ac5b11fb0deba4274f00ce33e6c2a41832523dee197459bcb484c55984024"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991141 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-brjbz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.990698 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991397 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-777dc4b79-zkfms" podUID="92a80c23-cba1-417f-bbd5-5c5138c3664a" containerName="keystone-api" containerID="cri-o://7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3" gracePeriod=30 Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991153 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-brjbz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991722 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-9hjkr" event={"ID":"745f142b-ee2d-4354-98b3-2b6cd13e3b5e","Type":"ContainerStarted","Data":"53a97fa5dd9a00c6172f8ff9d24faa76cde2783e196a7b038edb0f8d29f78316"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991756 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.991782 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992455 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a094-account-create-update-5m7nf" 
event={"ID":"9b555633-a53e-4689-b746-98bd29e6742e","Type":"ContainerStarted","Data":"fa8b88995542b222aab867afd52a17fd4fba23c19731dc20293d2e7bcdc0528d"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992484 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1011-account-create-update-86wgz"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992506 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cd013d81-347d-4c1c-9ccf-0f5e1a590755","Type":"ContainerDied","Data":"53b693b3924399c70496c4b7866c72831302d508f70145998a682b5cd6cadbf2"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992525 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53b693b3924399c70496c4b7866c72831302d508f70145998a682b5cd6cadbf2" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992538 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerStarted","Data":"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992552 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-skw6h"] Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerStarted","Data":"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992587 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerDied","Data":"4478c2eb2702eff226081dd62c590b24a317a608e7697bf1cd6f86ae6c3d71e1"} Feb 23 07:10:05 crc 
kubenswrapper[5118]: I0223 07:10:05.992612 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2e07-account-create-update-9hjkr" event={"ID":"745f142b-ee2d-4354-98b3-2b6cd13e3b5e","Type":"ContainerDied","Data":"53a97fa5dd9a00c6172f8ff9d24faa76cde2783e196a7b038edb0f8d29f78316"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992627 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a97fa5dd9a00c6172f8ff9d24faa76cde2783e196a7b038edb0f8d29f78316" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992645 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerDied","Data":"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992682 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerDied","Data":"7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992699 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6db8-account-create-update-2ckqc" event={"ID":"2c9e11a2-4b8f-4578-8c92-7d3e06258800","Type":"ContainerDied","Data":"7a8c9837986fa48aab8305e888ed5c6d683ef9586a5ec306bfcfa326a7564f1e"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992713 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a8c9837986fa48aab8305e888ed5c6d683ef9586a5ec306bfcfa326a7564f1e" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"334e9392-6a5f-4aa8-83d7-41e26e94dd32","Type":"ContainerDied","Data":"6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472"} Feb 23 
07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992739 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"460a8b7a-b61f-4f56-889e-54b5c2346679","Type":"ContainerDied","Data":"5fe9b8cda05c8c35fcafac2a173468abac5eccb15e3f49c0e175f828ab25acc4"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992753 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fe9b8cda05c8c35fcafac2a173468abac5eccb15e3f49c0e175f828ab25acc4" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992763 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-241c-account-create-update-2sgxt" event={"ID":"0079e5d0-c2f8-43f7-8dad-207eaedca4d6","Type":"ContainerDied","Data":"e1c3c27943fd922be8c877369c5e570f16319e733d265689f4387c2a51164658"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992773 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c3c27943fd922be8c877369c5e570f16319e733d265689f4387c2a51164658" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992783 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"84fdf432-1886-4e91-bd3c-bca6f1b90c3a","Type":"ContainerDied","Data":"269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992795 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="269735b34d3fb86fc641759aea22911ee76884e3a2348a8d5779bfe39a3512bc" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992910 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2a79f618-3555-44a5-8c52-ec9120261645","Type":"ContainerDied","Data":"7d93bed45f12f635a4f79ad66401c18ff3cd1fb3434c49200ff33284cce196d3"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992926 5118 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="7d93bed45f12f635a4f79ad66401c18ff3cd1fb3434c49200ff33284cce196d3" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992940 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf1e2e3d-0fc3-474a-a15d-6808347c8240","Type":"ContainerDied","Data":"fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992954 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcedf1092d1c13d1246bb56caf2b69ae2f4ec09b64162087349f53e983c0c268" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992964 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f94b-account-create-update-cvr4d" event={"ID":"bf376c5f-cc04-4733-9968-e199472b4241","Type":"ContainerDied","Data":"f990503873217772bfd06775b0547cdc1a28b885ca6f13e6a8f42b0cfa842983"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992975 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f990503873217772bfd06775b0547cdc1a28b885ca6f13e6a8f42b0cfa842983" Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992986 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerStarted","Data":"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.992999 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"376ff246-417d-442a-83d1-1579abd318ba","Type":"ContainerDied","Data":"a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0"} Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993014 5118 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a6a10ad7518edc61a27564a7c7336e6d8d6019edc247d01835baad623ae51fc0"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993026 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed","Type":"ContainerDied","Data":"dbd9f4de7ae688bd647f75ec9ac708ff74b177712c8b7acc2f8cbd82f2a1bdec"}
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993040 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbd9f4de7ae688bd647f75ec9ac708ff74b177712c8b7acc2f8cbd82f2a1bdec"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993050 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a094-account-create-update-5m7nf" event={"ID":"9b555633-a53e-4689-b746-98bd29e6742e","Type":"ContainerDied","Data":"fa8b88995542b222aab867afd52a17fd4fba23c19731dc20293d2e7bcdc0528d"}
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993061 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa8b88995542b222aab867afd52a17fd4fba23c19731dc20293d2e7bcdc0528d"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993209 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5b85dd46-5prjs" event={"ID":"b0e825c7-deb0-41b5-b358-f23dcc0f1082","Type":"ContainerDied","Data":"13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135"}
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993224 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13124c55d77e72251209f54aed646d929192a57cac960f30ab583ef76c0b0135"
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993236 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerDied","Data":"b78a6f1ffe539d4eb30bf74ad5a43f3802cf0350d1aba256152e434c6b28d94e"}
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993412 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerDied","Data":"6562661aead1d6afb939f1cc5250e5487e9146e38f4c8f43650a040bdd480d69"}
Feb 23 07:10:05 crc kubenswrapper[5118]: I0223 07:10:05.993620 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.029477 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.077740 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.107262 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-dshm9"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.108516 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.108810 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6m8\" (UniqueName: \"kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.129411 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-7hqls"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.151893 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-2ckqc"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.184931 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.185077 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.186051 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.194035 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.207172 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211287 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts\") pod \"652f586b-2ae4-4a45-bc82-01c65ec27696\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211333 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211419 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j74dw\" (UniqueName: \"kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211477 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwzxf\" (UniqueName: \"kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf\") pod \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211533 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211553 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts\") pod \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\" (UID: \"8a26f9ed-63ef-4fa5-934f-e1190d79cf85\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211605 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211626 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvnw7\" (UniqueName: \"kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7\") pod \"652f586b-2ae4-4a45-bc82-01c65ec27696\" (UID: \"652f586b-2ae4-4a45-bc82-01c65ec27696\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211678 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211753 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211783 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k2sk\" (UniqueName: \"kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.211915 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.212055 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.214363 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.216423 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.217909 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.218758 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.212088 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"178ef478-d8d3-49a5-9188-9970d3859049\" (UID: \"178ef478-d8d3-49a5-9188-9970d3859049\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.221339 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "652f586b-2ae4-4a45-bc82-01c65ec27696" (UID: "652f586b-2ae4-4a45-bc82-01c65ec27696"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.221385 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a26f9ed-63ef-4fa5-934f-e1190d79cf85" (UID: "8a26f9ed-63ef-4fa5-934f-e1190d79cf85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.223243 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.226174 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5b85dd46-5prjs"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.227006 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw" (OuterVolumeSpecName: "kube-api-access-j74dw") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "kube-api-access-j74dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.232500 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.232751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6m8\" (UniqueName: \"kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz"
Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.232906 5118 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.233072 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:06.733053812 +0000 UTC m=+1469.736838455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : configmap "openstack-scripts" not found
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233454 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/652f586b-2ae4-4a45-bc82-01c65ec27696-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233481 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j74dw\" (UniqueName: \"kubernetes.io/projected/04ba04e0-7d62-472f-ab31-c41f926c93e7-kube-api-access-j74dw\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233496 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ef478-d8d3-49a5-9188-9970d3859049-config-data-generated\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233523 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233533 5118 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-kolla-config\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233542 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-config-data-default\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.233553 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ef478-d8d3-49a5-9188-9970d3859049-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.239077 5118 projected.go:194] Error preparing data for projected volume kube-api-access-wl6m8 for pod openstack/keystone-1011-account-create-update-86wgz: failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.239210 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8 podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:06.739196061 +0000 UTC m=+1469.742980634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wl6m8" (UniqueName: "kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : failed to fetch token: serviceaccounts "galera-openstack" not found
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.239783 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-284f-account-create-update-75k6g"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.240794 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.251442 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk" (OuterVolumeSpecName: "kube-api-access-6k2sk") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "kube-api-access-6k2sk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.253434 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-284f-account-create-update-75k6g"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.263836 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf" (OuterVolumeSpecName: "kube-api-access-fwzxf") pod "8a26f9ed-63ef-4fa5-934f-e1190d79cf85" (UID: "8a26f9ed-63ef-4fa5-934f-e1190d79cf85"). InnerVolumeSpecName "kube-api-access-fwzxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.269192 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7" (OuterVolumeSpecName: "kube-api-access-wvnw7") pod "652f586b-2ae4-4a45-bc82-01c65ec27696" (UID: "652f586b-2ae4-4a45-bc82-01c65ec27696"). InnerVolumeSpecName "kube-api-access-wvnw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.282817 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.295173 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-77bd586555-4s2g7"]
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.315792 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "mysql-db") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334202 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334253 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334281 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334328 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data\") pod \"460a8b7a-b61f-4f56-889e-54b5c2346679\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334348 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334367 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmd9j\" (UniqueName: \"kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j\") pod \"460a8b7a-b61f-4f56-889e-54b5c2346679\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334399 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8fgs\" (UniqueName: \"kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs\") pod \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334439 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle\") pod \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334464 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdnj8\" (UniqueName: \"kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334485 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334503 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334533 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts\") pod \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\" (UID: \"2c9e11a2-4b8f-4578-8c92-7d3e06258800\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334602 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data\") pod \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334647 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334667 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334720 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334761 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334808 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbrhk\" (UniqueName: \"kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk\") pod \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\" (UID: \"cf1e2e3d-0fc3-474a-a15d-6808347c8240\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334826 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle\") pod \"460a8b7a-b61f-4f56-889e-54b5c2346679\" (UID: \"460a8b7a-b61f-4f56-889e-54b5c2346679\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334854 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts\") pod \"2a79f618-3555-44a5-8c52-ec9120261645\" (UID: \"2a79f618-3555-44a5-8c52-ec9120261645\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334872 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334903 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xcp\" (UniqueName: \"kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.334938 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336062 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336090 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwzxf\" (UniqueName: \"kubernetes.io/projected/8a26f9ed-63ef-4fa5-934f-e1190d79cf85-kube-api-access-fwzxf\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336118 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvnw7\" (UniqueName: \"kubernetes.io/projected/652f586b-2ae4-4a45-bc82-01c65ec27696-kube-api-access-wvnw7\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336127 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k2sk\" (UniqueName: \"kubernetes.io/projected/178ef478-d8d3-49a5-9188-9970d3859049-kube-api-access-6k2sk\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336345 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.336742 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs" (OuterVolumeSpecName: "logs") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.340699 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs" (OuterVolumeSpecName: "logs") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.340797 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c9e11a2-4b8f-4578-8c92-7d3e06258800" (UID: "2c9e11a2-4b8f-4578-8c92-7d3e06258800"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.369215 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk" (OuterVolumeSpecName: "kube-api-access-rbrhk") pod "cf1e2e3d-0fc3-474a-a15d-6808347c8240" (UID: "cf1e2e3d-0fc3-474a-a15d-6808347c8240"). InnerVolumeSpecName "kube-api-access-rbrhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.371792 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j" (OuterVolumeSpecName: "kube-api-access-bmd9j") pod "460a8b7a-b61f-4f56-889e-54b5c2346679" (UID: "460a8b7a-b61f-4f56-889e-54b5c2346679"). InnerVolumeSpecName "kube-api-access-bmd9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.386753 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs" (OuterVolumeSpecName: "kube-api-access-j8fgs") pod "2c9e11a2-4b8f-4578-8c92-7d3e06258800" (UID: "2c9e11a2-4b8f-4578-8c92-7d3e06258800"). InnerVolumeSpecName "kube-api-access-j8fgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.397275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8" (OuterVolumeSpecName: "kube-api-access-wdnj8") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "kube-api-access-wdnj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.398641 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts" (OuterVolumeSpecName: "scripts") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.398754 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts" (OuterVolumeSpecName: "scripts") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.406027 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp" (OuterVolumeSpecName: "kube-api-access-54xcp") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "kube-api-access-54xcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.408461 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.436738 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.436863 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.436913 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437007 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437034 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdr4s\" (UniqueName: \"kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437090 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437167 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437248 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs\") pod \"376ff246-417d-442a-83d1-1579abd318ba\" (UID: \"376ff246-417d-442a-83d1-1579abd318ba\") "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437650 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437663 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbrhk\" (UniqueName: \"kubernetes.io/projected/cf1e2e3d-0fc3-474a-a15d-6808347c8240-kube-api-access-rbrhk\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437675 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437684 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437692 5118 reconciler_common.go:293] "Volume detached for volume
\"kube-api-access-54xcp\" (UniqueName: \"kubernetes.io/projected/b0e825c7-deb0-41b5-b358-f23dcc0f1082-kube-api-access-54xcp\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437702 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437711 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmd9j\" (UniqueName: \"kubernetes.io/projected/460a8b7a-b61f-4f56-889e-54b5c2346679-kube-api-access-bmd9j\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437719 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8fgs\" (UniqueName: \"kubernetes.io/projected/2c9e11a2-4b8f-4578-8c92-7d3e06258800-kube-api-access-j8fgs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437728 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdnj8\" (UniqueName: \"kubernetes.io/projected/2a79f618-3555-44a5-8c52-ec9120261645-kube-api-access-wdnj8\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437736 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0e825c7-deb0-41b5-b358-f23dcc0f1082-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437743 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a79f618-3555-44a5-8c52-ec9120261645-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.437752 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c9e11a2-4b8f-4578-8c92-7d3e06258800-operator-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.439157 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.442526 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs" (OuterVolumeSpecName: "logs") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.482588 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.486344 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts" (OuterVolumeSpecName: "scripts") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.488180 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s" (OuterVolumeSpecName: "kube-api-access-kdr4s") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "kube-api-access-kdr4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.507142 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="galera" containerID="cri-o://0123533d83aacb1fef7bf221a008c5a2e879c0ab52103eda69db30b401c15faa" gracePeriod=30 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.540543 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.540587 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdr4s\" (UniqueName: \"kubernetes.io/projected/376ff246-417d-442a-83d1-1579abd318ba-kube-api-access-kdr4s\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.540599 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.540625 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.540640 5118 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/376ff246-417d-442a-83d1-1579abd318ba-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.542226 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "460a8b7a-b61f-4f56-889e-54b5c2346679" (UID: "460a8b7a-b61f-4f56-889e-54b5c2346679"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.597593 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.644878 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.644908 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.674889 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.675047 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.675875 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.676347 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.676592 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.676748 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.676775 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server" Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.689031 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.689211 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.745718 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" probeResult="failure" output="command timed out" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.745940 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data" (OuterVolumeSpecName: "config-data") pod "460a8b7a-b61f-4f56-889e-54b5c2346679" (UID: "460a8b7a-b61f-4f56-889e-54b5c2346679"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.751471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.751527 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6m8\" (UniqueName: \"kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.751620 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.751631 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460a8b7a-b61f-4f56-889e-54b5c2346679-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.751628 5118 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.751703 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:07.751684774 +0000 UTC m=+1470.755469347 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : configmap "openstack-scripts" not found Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.755218 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.199:3000/\": dial tcp 10.217.0.199:3000: connect: connection refused" Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.767060 5118 projected.go:194] Error preparing data for projected volume kube-api-access-wl6m8 for pod openstack/keystone-1011-account-create-update-86wgz: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:10:06 crc kubenswrapper[5118]: E0223 07:10:06.767146 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8 podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:07.767125069 +0000 UTC m=+1470.770909642 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wl6m8" (UniqueName: "kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.769640 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.776474 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" event={"ID":"08b38d85-cf57-41a9-9779-1593300b77a3","Type":"ContainerDied","Data":"c1a7cde756082566a8a408e773200dce857577a40647d552b637bbb7c31095d8"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.778436 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1a7cde756082566a8a408e773200dce857577a40647d552b637bbb7c31095d8" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.780167 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data" (OuterVolumeSpecName: "config-data") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.782291 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69db8f76f-xbrx7" event={"ID":"e66bcbb5-075a-4a87-981c-0dc608f19742","Type":"ContainerDied","Data":"ff230597bdf1525b2a5fda543a890742ba45b434150c6322561fdf78b0de4c0e"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.782323 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff230597bdf1525b2a5fda543a890742ba45b434150c6322561fdf78b0de4c0e" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.785465 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerStarted","Data":"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.785605 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-75595465d9-2pkqt" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker-log" containerID="cri-o://1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec" gracePeriod=30 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.786130 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-75595465d9-2pkqt" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker" containerID="cri-o://c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4" gracePeriod=30 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.801673 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerID="245fe6e0f678afc06eab998734d901141ec48691b8a044347126b5ed7fc789e4" exitCode=1 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.801743 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-skw6h" event={"ID":"5e254320-082c-442b-a1a9-4b7fafe2c556","Type":"ContainerDied","Data":"245fe6e0f678afc06eab998734d901141ec48691b8a044347126b5ed7fc789e4"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.801777 5118 scope.go:117] "RemoveContainer" containerID="ebff57dc3960e140d5d736483b49e68e1f66fb01b96162a5b943eef862f02286" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.809857 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-75595465d9-2pkqt" podStartSLOduration=11.809841776 podStartE2EDuration="11.809841776s" podCreationTimestamp="2026-02-23 07:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:06.807330494 +0000 UTC m=+1469.811115067" watchObservedRunningTime="2026-02-23 07:10:06.809841776 +0000 UTC m=+1469.813626349" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.828287 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5df75dfc9b-mpgf2" event={"ID":"95c7c403-ece4-4778-9a1c-25dbc355a0bf","Type":"ContainerDied","Data":"7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.828444 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7caaa1a5bca8189360df1fb61dbf27135b9ade5ae93377261f21e86ee5fb96a3" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.846341 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.853846 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.853874 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.853883 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.859502 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"334e9392-6a5f-4aa8-83d7-41e26e94dd32","Type":"ContainerDied","Data":"1fcd449de2364e242719177391062d4a455abf457bbd8dcfb5134cf4c7c827b9"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.859541 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fcd449de2364e242719177391062d4a455abf457bbd8dcfb5134cf4c7c827b9" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.863207 5118 generic.go:334] "Generic (PLEG): container finished" podID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerID="3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297" exitCode=143 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.863242 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerDied","Data":"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 
07:10:06.869591 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2de9-account-create-update-7hqls" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873190 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873217 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873275 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873325 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerStarted","Data":"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba"} Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873399 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873195 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6db8-account-create-update-2ckqc" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873531 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873531 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873555 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b5b85dd46-5prjs" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.873629 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9656-account-create-update-dshm9" Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.886852 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" containerID="cri-o://a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f" gracePeriod=30 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.887591 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" containerID="cri-o://f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba" gracePeriod=30 Feb 23 07:10:06 crc kubenswrapper[5118]: I0223 07:10:06.888007 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8c77bbddd-85rm4" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.978480 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.984560 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data" (OuterVolumeSpecName: "config-data") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.984740 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") pod \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\" (UID: \"b0e825c7-deb0-41b5-b358-f23dcc0f1082\") " Feb 23 07:10:07 crc kubenswrapper[5118]: W0223 07:10:06.984846 5118 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b0e825c7-deb0-41b5-b358-f23dcc0f1082/volumes/kubernetes.io~secret/combined-ca-bundle Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.984854 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.984914 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") pod \"04ba04e0-7d62-472f-ab31-c41f926c93e7\" (UID: \"04ba04e0-7d62-472f-ab31-c41f926c93e7\") " Feb 23 07:10:07 crc kubenswrapper[5118]: W0223 07:10:06.984999 5118 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/04ba04e0-7d62-472f-ab31-c41f926c93e7/volumes/kubernetes.io~secret/config-data Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.985006 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data" (OuterVolumeSpecName: "config-data") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.985708 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:06.985723 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.022965 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8c77bbddd-85rm4" podStartSLOduration=11.022883804 podStartE2EDuration="11.022883804s" podCreationTimestamp="2026-02-23 07:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 07:10:06.986980192 +0000 UTC m=+1469.990764765" watchObservedRunningTime="2026-02-23 07:10:07.022883804 +0000 UTC m=+1470.026668377" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.065962 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1e2e3d-0fc3-474a-a15d-6808347c8240" (UID: "cf1e2e3d-0fc3-474a-a15d-6808347c8240"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.091796 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.125494 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" probeResult="failure" output=< Feb 23 07:10:07 crc kubenswrapper[5118]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Feb 23 07:10:07 crc kubenswrapper[5118]: > Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.128526 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8c77bbddd-85rm4" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.164279 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.171235 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data" (OuterVolumeSpecName: "config-data") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.184485 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.193593 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b0e825c7-deb0-41b5-b358-f23dcc0f1082" (UID: "b0e825c7-deb0-41b5-b358-f23dcc0f1082"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.194862 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.194895 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.194906 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.194935 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e825c7-deb0-41b5-b358-f23dcc0f1082-config-data\") on node \"crc\" 
DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.256340 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.258309 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data" (OuterVolumeSpecName: "config-data") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.299009 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.299042 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.310199 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.356418 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "376ff246-417d-442a-83d1-1579abd318ba" (UID: "376ff246-417d-442a-83d1-1579abd318ba"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.361873 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data" (OuterVolumeSpecName: "config-data") pod "cf1e2e3d-0fc3-474a-a15d-6808347c8240" (UID: "cf1e2e3d-0fc3-474a-a15d-6808347c8240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.365161 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.370336 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "04ba04e0-7d62-472f-ab31-c41f926c93e7" (UID: "04ba04e0-7d62-472f-ab31-c41f926c93e7"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.393864 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "178ef478-d8d3-49a5-9188-9970d3859049" (UID: "178ef478-d8d3-49a5-9188-9970d3859049"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.400881 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2a79f618-3555-44a5-8c52-ec9120261645" (UID: "2a79f618-3555-44a5-8c52-ec9120261645"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401403 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401436 5118 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401452 5118 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/04ba04e0-7d62-472f-ab31-c41f926c93e7-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401463 5118 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ef478-d8d3-49a5-9188-9970d3859049-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401476 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/376ff246-417d-442a-83d1-1579abd318ba-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401487 5118 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cf1e2e3d-0fc3-474a-a15d-6808347c8240-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.401497 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a79f618-3555-44a5-8c52-ec9120261645-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.620304 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.696651 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.710057 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle\") pod \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.710140 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs\") pod \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.710296 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config\") pod \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.710558 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4492q\" (UniqueName: \"kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q\") pod \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\" (UID: \"b6c8ab95-d7a9-4f39-abff-bd8fd89590ed\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.720881 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q" (OuterVolumeSpecName: "kube-api-access-4492q") pod "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" (UID: "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed"). InnerVolumeSpecName "kube-api-access-4492q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.727369 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0542fde3-000d-4897-9f0b-ec6a050ac5be" path="/var/lib/kubelet/pods/0542fde3-000d-4897-9f0b-ec6a050ac5be/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.727658 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4492q\" (UniqueName: \"kubernetes.io/projected/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-api-access-4492q\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.729513 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15482ef3-bf3a-4442-9b2e-89222e09d218" path="/var/lib/kubelet/pods/15482ef3-bf3a-4442-9b2e-89222e09d218/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.730282 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f6bea7d-670d-4ade-a7d6-0fea6a7e503d" path="/var/lib/kubelet/pods/2f6bea7d-670d-4ade-a7d6-0fea6a7e503d/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.730741 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff79df5-722f-4ea5-91a2-8368d8eeee99" 
path="/var/lib/kubelet/pods/2ff79df5-722f-4ea5-91a2-8368d8eeee99/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.731528 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.732782 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b7ca43-be43-4d92-9d24-6dbf40d3132d" path="/var/lib/kubelet/pods/49b7ca43-be43-4d92-9d24-6dbf40d3132d/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.733578 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f945343-e53c-4dc0-a8e9-4165dd32b8b8" path="/var/lib/kubelet/pods/5f945343-e53c-4dc0-a8e9-4165dd32b8b8/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.743124 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9568fac1-abdb-4b34-a0b7-e27d6c2183ee" path="/var/lib/kubelet/pods/9568fac1-abdb-4b34-a0b7-e27d6c2183ee/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.743670 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-cvr4d" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.745216 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.745862 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-2sgxt" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.746662 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d221d72b-6274-48da-900c-284185365e14" path="/var/lib/kubelet/pods/d221d72b-6274-48da-900c-284185365e14/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.747917 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" path="/var/lib/kubelet/pods/e75b838e-decf-4583-8b96-a41f54e2a654/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.749174 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" path="/var/lib/kubelet/pods/f424d603-7efb-4075-9a9b-5117dec09a6a/volumes" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.754822 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.775834 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" (UID: "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.779466 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"] Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.783901 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.789568 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6db8-account-create-update-2ckqc"] Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.795230 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" (UID: "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.798953 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.832786 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.832850 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.832920 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc 
kubenswrapper[5118]: I0223 07:10:07.832943 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data\") pod \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.832997 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle\") pod \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833062 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdnn\" (UniqueName: \"kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn\") pod \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833122 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjkg8\" (UniqueName: \"kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833152 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833187 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs\") pod \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833210 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs\") pod \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\" (UID: \"84fdf432-1886-4e91-bd3c-bca6f1b90c3a\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833230 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle\") pod \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\" (UID: \"cd013d81-347d-4c1c-9ccf-0f5e1a590755\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833588 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833646 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6m8\" (UniqueName: \"kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8\") pod \"keystone-1011-account-create-update-86wgz\" (UID: \"ce9241f2-96be-4510-8216-34293762880a\") " pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833689 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 
23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.833699 5118 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: E0223 07:10:07.848143 5118 projected.go:194] Error preparing data for projected volume kube-api-access-wl6m8 for pod openstack/keystone-1011-account-create-update-86wgz: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:10:07 crc kubenswrapper[5118]: E0223 07:10:07.848203 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8 podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:09.848183876 +0000 UTC m=+1472.851968449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wl6m8" (UniqueName: "kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.849292 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs" (OuterVolumeSpecName: "logs") pod "84fdf432-1886-4e91-bd3c-bca6f1b90c3a" (UID: "84fdf432-1886-4e91-bd3c-bca6f1b90c3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.850733 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" (UID: "b6c8ab95-d7a9-4f39-abff-bd8fd89590ed"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: E0223 07:10:07.852553 5118 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 23 07:10:07 crc kubenswrapper[5118]: E0223 07:10:07.852685 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts podName:ce9241f2-96be-4510-8216-34293762880a nodeName:}" failed. No retries permitted until 2026-02-23 07:10:09.852653844 +0000 UTC m=+1472.856438407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts") pod "keystone-1011-account-create-update-86wgz" (UID: "ce9241f2-96be-4510-8216-34293762880a") : configmap "openstack-scripts" not found Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.853073 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs" (OuterVolumeSpecName: "logs") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.861777 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:10:07 crc kubenswrapper[5118]: E0223 07:10:07.866676 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wl6m8 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-1011-account-create-update-86wgz" podUID="ce9241f2-96be-4510-8216-34293762880a" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.873394 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8" (OuterVolumeSpecName: "kube-api-access-jjkg8") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "kube-api-access-jjkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.878936 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn" (OuterVolumeSpecName: "kube-api-access-vgdnn") pod "84fdf432-1886-4e91-bd3c-bca6f1b90c3a" (UID: "84fdf432-1886-4e91-bd3c-bca6f1b90c3a"). InnerVolumeSpecName "kube-api-access-vgdnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.879464 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868db49cd-qncjp" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.907216 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.913574 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f424d603-7efb-4075-9a9b-5117dec09a6a" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: i/o timeout" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.927986 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-skw6h" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.933807 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data" (OuterVolumeSpecName: "config-data") pod "84fdf432-1886-4e91-bd3c-bca6f1b90c3a" (UID: "84fdf432-1886-4e91-bd3c-bca6f1b90c3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934409 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle\") pod \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934447 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934570 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs\") pod 
\"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934596 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88fwz\" (UniqueName: \"kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz\") pod \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934648 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934670 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztlj6\" (UniqueName: \"kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6\") pod \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934693 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts\") pod \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\" (UID: \"0079e5d0-c2f8-43f7-8dad-207eaedca4d6\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934711 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934728 5118 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kp268\" (UniqueName: \"kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934752 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934769 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhgn\" (UniqueName: \"kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn\") pod \"9b555633-a53e-4689-b746-98bd29e6742e\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934805 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srdls\" (UniqueName: \"kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls\") pod \"bf376c5f-cc04-4733-9968-e199472b4241\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934828 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts\") pod \"bf376c5f-cc04-4733-9968-e199472b4241\" (UID: \"bf376c5f-cc04-4733-9968-e199472b4241\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtg9b\" (UniqueName: \"kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b\") 
pod \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934867 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934914 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data\") pod \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\" (UID: \"334e9392-6a5f-4aa8-83d7-41e26e94dd32\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934930 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts\") pod \"9b555633-a53e-4689-b746-98bd29e6742e\" (UID: \"9b555633-a53e-4689-b746-98bd29e6742e\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.934965 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts\") pod \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\" (UID: \"745f142b-ee2d-4354-98b3-2b6cd13e3b5e\") " Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.935312 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjkg8\" (UniqueName: \"kubernetes.io/projected/cd013d81-347d-4c1c-9ccf-0f5e1a590755-kube-api-access-jjkg8\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.935328 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cd013d81-347d-4c1c-9ccf-0f5e1a590755-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.935339 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.935347 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.935868 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf376c5f-cc04-4733-9968-e199472b4241" (UID: "bf376c5f-cc04-4733-9968-e199472b4241"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.937807 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.939771 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0079e5d0-c2f8-43f7-8dad-207eaedca4d6" (UID: "0079e5d0-c2f8-43f7-8dad-207eaedca4d6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.941569 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b555633-a53e-4689-b746-98bd29e6742e" (UID: "9b555633-a53e-4689-b746-98bd29e6742e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.947019 5118 generic.go:334] "Generic (PLEG): container finished" podID="5f515e00-c6e0-4849-b073-64721780e216" containerID="a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f" exitCode=143 Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.947121 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerDied","Data":"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f"} Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.948891 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz" (OuterVolumeSpecName: "kube-api-access-88fwz") pod "0079e5d0-c2f8-43f7-8dad-207eaedca4d6" (UID: "0079e5d0-c2f8-43f7-8dad-207eaedca4d6"). InnerVolumeSpecName "kube-api-access-88fwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.953832 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "745f142b-ee2d-4354-98b3-2b6cd13e3b5e" (UID: "745f142b-ee2d-4354-98b3-2b6cd13e3b5e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.953990 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls" (OuterVolumeSpecName: "kube-api-access-srdls") pod "bf376c5f-cc04-4733-9968-e199472b4241" (UID: "bf376c5f-cc04-4733-9968-e199472b4241"). InnerVolumeSpecName "kube-api-access-srdls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.956494 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs" (OuterVolumeSpecName: "logs") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.958308 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"] Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.958484 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6" (OuterVolumeSpecName: "kube-api-access-ztlj6") pod "745f142b-ee2d-4354-98b3-2b6cd13e3b5e" (UID: "745f142b-ee2d-4354-98b3-2b6cd13e3b5e"). InnerVolumeSpecName "kube-api-access-ztlj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.962411 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268" (OuterVolumeSpecName: "kube-api-access-kp268") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "kube-api-access-kp268". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.964494 5118 generic.go:334] "Generic (PLEG): container finished" podID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerID="1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec" exitCode=143 Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.964647 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerDied","Data":"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec"} Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.966369 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.967517 5118 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.967607 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdnn\" (UniqueName: \"kubernetes.io/projected/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-kube-api-access-vgdnn\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.973474 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.974113 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn" (OuterVolumeSpecName: "kube-api-access-xjhgn") pod "9b555633-a53e-4689-b746-98bd29e6742e" (UID: "9b555633-a53e-4689-b746-98bd29e6742e"). InnerVolumeSpecName "kube-api-access-xjhgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.975749 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-skw6h" event={"ID":"5e254320-082c-442b-a1a9-4b7fafe2c556","Type":"ContainerDied","Data":"5f192cafde438fd57c36a523384e8d786abec6794e339c9dc4d5871e4051c5cb"} Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.975792 5118 scope.go:117] "RemoveContainer" containerID="245fe6e0f678afc06eab998734d901141ec48691b8a044347126b5ed7fc789e4" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.975912 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-skw6h" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.992374 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9656-account-create-update-dshm9"] Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.993791 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b" (OuterVolumeSpecName: "kube-api-access-xtg9b") pod "334e9392-6a5f-4aa8-83d7-41e26e94dd32" (UID: "334e9392-6a5f-4aa8-83d7-41e26e94dd32"). InnerVolumeSpecName "kube-api-access-xtg9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:07 crc kubenswrapper[5118]: I0223 07:10:07.993843 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data" (OuterVolumeSpecName: "config-data") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.027081 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84fdf432-1886-4e91-bd3c-bca6f1b90c3a" (UID: "84fdf432-1886-4e91-bd3c-bca6f1b90c3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.027175 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerStarted","Data":"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"} Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.066482 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070635 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data\") pod \"08b38d85-cf57-41a9-9779-1593300b77a3\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070679 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9wp\" (UniqueName: 
\"kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp\") pod \"08b38d85-cf57-41a9-9779-1593300b77a3\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070713 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4gmc\" (UniqueName: \"kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc\") pod \"5e254320-082c-442b-a1a9-4b7fafe2c556\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070743 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom\") pod \"08b38d85-cf57-41a9-9779-1593300b77a3\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070778 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle\") pod \"e66bcbb5-075a-4a87-981c-0dc608f19742\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070795 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs\") pod \"08b38d85-cf57-41a9-9779-1593300b77a3\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070813 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data\") pod \"e66bcbb5-075a-4a87-981c-0dc608f19742\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 
07:10:08.070831 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts\") pod \"5e254320-082c-442b-a1a9-4b7fafe2c556\" (UID: \"5e254320-082c-442b-a1a9-4b7fafe2c556\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070854 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle\") pod \"08b38d85-cf57-41a9-9779-1593300b77a3\" (UID: \"08b38d85-cf57-41a9-9779-1593300b77a3\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070873 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs\") pod \"e66bcbb5-075a-4a87-981c-0dc608f19742\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070889 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data\") pod \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070930 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom\") pod \"e66bcbb5-075a-4a87-981c-0dc608f19742\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070961 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsk5g\" (UniqueName: \"kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g\") pod 
\"e66bcbb5-075a-4a87-981c-0dc608f19742\" (UID: \"e66bcbb5-075a-4a87-981c-0dc608f19742\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.070995 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config\") pod \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071012 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs\") pod \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071039 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8wl8\" (UniqueName: \"kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8\") pod \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071079 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle\") pod \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\" (UID: \"52ab08e4-a114-4c99-adc6-dc05f711d8d9\") " Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071397 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88fwz\" (UniqueName: \"kubernetes.io/projected/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-kube-api-access-88fwz\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071412 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071421 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztlj6\" (UniqueName: \"kubernetes.io/projected/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-kube-api-access-ztlj6\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071430 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0079e5d0-c2f8-43f7-8dad-207eaedca4d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071441 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp268\" (UniqueName: \"kubernetes.io/projected/95c7c403-ece4-4778-9a1c-25dbc355a0bf-kube-api-access-kp268\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071450 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071458 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjhgn\" (UniqueName: \"kubernetes.io/projected/9b555633-a53e-4689-b746-98bd29e6742e-kube-api-access-xjhgn\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071467 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srdls\" (UniqueName: \"kubernetes.io/projected/bf376c5f-cc04-4733-9968-e199472b4241-kube-api-access-srdls\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071475 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bf376c5f-cc04-4733-9968-e199472b4241-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071485 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtg9b\" (UniqueName: \"kubernetes.io/projected/334e9392-6a5f-4aa8-83d7-41e26e94dd32-kube-api-access-xtg9b\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071495 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b555633-a53e-4689-b746-98bd29e6742e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071503 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/745f142b-ee2d-4354-98b3-2b6cd13e3b5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.071513 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c7c403-ece4-4778-9a1c-25dbc355a0bf-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.072300 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e254320-082c-442b-a1a9-4b7fafe2c556" (UID: "5e254320-082c-442b-a1a9-4b7fafe2c556"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.072672 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.072976 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.073425 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs" (OuterVolumeSpecName: "logs") pod "08b38d85-cf57-41a9-9779-1593300b77a3" (UID: "08b38d85-cf57-41a9-9779-1593300b77a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.081463 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "52ab08e4-a114-4c99-adc6-dc05f711d8d9" (UID: "52ab08e4-a114-4c99-adc6-dc05f711d8d9"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: E0223 07:10:08.085039 5118 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 07:10:08 crc kubenswrapper[5118]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-23T07:10:00Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:10:08 crc kubenswrapper[5118]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Feb 23 07:10:08 crc kubenswrapper[5118]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-f8rdg" message=< Feb 23 07:10:08 crc kubenswrapper[5118]: Exiting ovn-controller (1) [FAILED] Feb 23 07:10:08 crc kubenswrapper[5118]: Killing ovn-controller (1) [ OK ] Feb 23 07:10:08 crc kubenswrapper[5118]: Killing ovn-controller (1) with SIGKILL [ OK ] Feb 23 07:10:08 crc kubenswrapper[5118]: 2026-02-23T07:10:00Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:10:08 crc kubenswrapper[5118]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Feb 23 07:10:08 crc kubenswrapper[5118]: > Feb 23 07:10:08 crc kubenswrapper[5118]: E0223 07:10:08.085082 5118 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 23 07:10:08 crc kubenswrapper[5118]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-23T07:10:00Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 23 07:10:08 crc kubenswrapper[5118]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Feb 23 07:10:08 crc kubenswrapper[5118]: > pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" containerID="cri-o://682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.085137 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ovn-controller-f8rdg" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller" containerID="cri-o://682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd" gracePeriod=21 Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.088842 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs" (OuterVolumeSpecName: "logs") pod "e66bcbb5-075a-4a87-981c-0dc608f19742" (UID: "e66bcbb5-075a-4a87-981c-0dc608f19742"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.098651 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data" (OuterVolumeSpecName: "config-data") pod "52ab08e4-a114-4c99-adc6-dc05f711d8d9" (UID: "52ab08e4-a114-4c99-adc6-dc05f711d8d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.109678 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "84fdf432-1886-4e91-bd3c-bca6f1b90c3a" (UID: "84fdf432-1886-4e91-bd3c-bca6f1b90c3a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.109756 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp" (OuterVolumeSpecName: "kube-api-access-5q9wp") pod "08b38d85-cf57-41a9-9779-1593300b77a3" (UID: "08b38d85-cf57-41a9-9779-1593300b77a3"). InnerVolumeSpecName "kube-api-access-5q9wp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.109891 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e66bcbb5-075a-4a87-981c-0dc608f19742" (UID: "e66bcbb5-075a-4a87-981c-0dc608f19742"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110124 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g" (OuterVolumeSpecName: "kube-api-access-qsk5g") pod "e66bcbb5-075a-4a87-981c-0dc608f19742" (UID: "e66bcbb5-075a-4a87-981c-0dc608f19742"). InnerVolumeSpecName "kube-api-access-qsk5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110162 5118 generic.go:334] "Generic (PLEG): container finished" podID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" containerID="275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97" exitCode=0 Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110242 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110259 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110305 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a094-account-create-update-5m7nf" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110318 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52ab08e4-a114-4c99-adc6-dc05f711d8d9","Type":"ContainerDied","Data":"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97"} Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"52ab08e4-a114-4c99-adc6-dc05f711d8d9","Type":"ContainerDied","Data":"2bd32646f8f8eec7c59c41f532ac0fc44795487171516ec5a5dcbbf77f899b7f"} Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110361 5118 scope.go:117] "RemoveContainer" containerID="275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110364 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110398 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110432 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-241c-account-create-update-2sgxt" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110460 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110494 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f94b-account-create-update-cvr4d" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110497 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110519 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69db8f76f-xbrx7" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110645 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2e07-account-create-update-9hjkr" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.110695 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5df75dfc9b-mpgf2" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.111358 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc" (OuterVolumeSpecName: "kube-api-access-b4gmc") pod "5e254320-082c-442b-a1a9-4b7fafe2c556" (UID: "5e254320-082c-442b-a1a9-4b7fafe2c556"). InnerVolumeSpecName "kube-api-access-b4gmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.117267 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data" (OuterVolumeSpecName: "config-data") pod "334e9392-6a5f-4aa8-83d7-41e26e94dd32" (UID: "334e9392-6a5f-4aa8-83d7-41e26e94dd32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.117406 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08b38d85-cf57-41a9-9779-1593300b77a3" (UID: "08b38d85-cf57-41a9-9779-1593300b77a3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.124528 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8" (OuterVolumeSpecName: "kube-api-access-r8wl8") pod "52ab08e4-a114-4c99-adc6-dc05f711d8d9" (UID: "52ab08e4-a114-4c99-adc6-dc05f711d8d9"). InnerVolumeSpecName "kube-api-access-r8wl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.131486 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.131920 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-77bd586555-4s2g7" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.173:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.132647 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-77bd586555-4s2g7" podUID="e75b838e-decf-4583-8b96-a41f54e2a654" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.173:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.176289 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.176786 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") pod \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\" (UID: \"95c7c403-ece4-4778-9a1c-25dbc355a0bf\") " Feb 23 07:10:08 crc kubenswrapper[5118]: W0223 07:10:08.177460 5118 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/95c7c403-ece4-4778-9a1c-25dbc355a0bf/volumes/kubernetes.io~secret/combined-ca-bundle Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177482 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177539 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177554 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66bcbb5-075a-4a87-981c-0dc608f19742-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177564 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177572 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177581 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsk5g\" (UniqueName: \"kubernetes.io/projected/e66bcbb5-075a-4a87-981c-0dc608f19742-kube-api-access-qsk5g\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177589 5118 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177598 5118 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/84fdf432-1886-4e91-bd3c-bca6f1b90c3a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 
07:10:08.177607 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8wl8\" (UniqueName: \"kubernetes.io/projected/52ab08e4-a114-4c99-adc6-dc05f711d8d9-kube-api-access-r8wl8\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177616 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177625 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177634 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9wp\" (UniqueName: \"kubernetes.io/projected/08b38d85-cf57-41a9-9779-1593300b77a3-kube-api-access-5q9wp\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177642 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4gmc\" (UniqueName: \"kubernetes.io/projected/5e254320-082c-442b-a1a9-4b7fafe2c556-kube-api-access-b4gmc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177651 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177659 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08b38d85-cf57-41a9-9779-1593300b77a3-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.177667 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/5e254320-082c-442b-a1a9-4b7fafe2c556-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.180084 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.188253 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "334e9392-6a5f-4aa8-83d7-41e26e94dd32" (UID: "334e9392-6a5f-4aa8-83d7-41e26e94dd32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.227165 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.246225 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cd013d81-347d-4c1c-9ccf-0f5e1a590755" (UID: "cd013d81-347d-4c1c-9ccf-0f5e1a590755"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.275810 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08b38d85-cf57-41a9-9779-1593300b77a3" (UID: "08b38d85-cf57-41a9-9779-1593300b77a3"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.282701 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334e9392-6a5f-4aa8-83d7-41e26e94dd32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.282733 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd013d81-347d-4c1c-9ccf-0f5e1a590755-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.282742 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.282750 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.311785 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.343807 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e66bcbb5-075a-4a87-981c-0dc608f19742" (UID: "e66bcbb5-075a-4a87-981c-0dc608f19742"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.344563 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.357397 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data" (OuterVolumeSpecName: "config-data") pod "08b38d85-cf57-41a9-9779-1593300b77a3" (UID: "08b38d85-cf57-41a9-9779-1593300b77a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.361628 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.364347 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.385146 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.385974 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08b38d85-cf57-41a9-9779-1593300b77a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.386008 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.386022 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.389629 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52ab08e4-a114-4c99-adc6-dc05f711d8d9" (UID: "52ab08e4-a114-4c99-adc6-dc05f711d8d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.421718 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data" (OuterVolumeSpecName: "config-data") pod "95c7c403-ece4-4778-9a1c-25dbc355a0bf" (UID: "95c7c403-ece4-4778-9a1c-25dbc355a0bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.453247 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data" (OuterVolumeSpecName: "config-data") pod "e66bcbb5-075a-4a87-981c-0dc608f19742" (UID: "e66bcbb5-075a-4a87-981c-0dc608f19742"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.453318 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.466289 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.466340 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.476179 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.489918 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66bcbb5-075a-4a87-981c-0dc608f19742-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.489952 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c7c403-ece4-4778-9a1c-25dbc355a0bf-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.489964 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.521981 5118 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/placement-6b5b85dd46-5prjs"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.601739 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.641225 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.677086 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "52ab08e4-a114-4c99-adc6-dc05f711d8d9" (UID: "52ab08e4-a114-4c99-adc6-dc05f711d8d9"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.703990 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.718003 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2de9-account-create-update-7hqls"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.751740 5118 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/52ab08e4-a114-4c99-adc6-dc05f711d8d9-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:08 crc kubenswrapper[5118]: E0223 07:10:08.752794 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 23 07:10:08 crc kubenswrapper[5118]: E0223 07:10:08.752856 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data podName:e3b37356-5c38-40b3-af55-4f25a2f16b21 nodeName:}" failed. 
No retries permitted until 2026-02-23 07:10:16.752836493 +0000 UTC m=+1479.756621076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data") pod "rabbitmq-server-0" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21") : configmap "rabbitmq-config-data" not found Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.806640 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-86wgz" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.863312 5118 scope.go:117] "RemoveContainer" containerID="275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97" Feb 23 07:10:08 crc kubenswrapper[5118]: E0223 07:10:08.865914 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97\": container with ID starting with 275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97 not found: ID does not exist" containerID="275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.865956 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97"} err="failed to get container status \"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97\": rpc error: code = NotFound desc = could not find container \"275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97\": container with ID starting with 275849184e3578f877febb08e8a36f201360c92247717ad31c8fdc35c18e1d97 not found: ID does not exist" Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.933037 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"] Feb 23 
07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.945081 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-241c-account-create-update-2sgxt"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.954872 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:10:08 crc kubenswrapper[5118]: I0223 07:10:08.966851 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.023884 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.033270 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a094-account-create-update-5m7nf"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.047944 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.049581 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-f8rdg_c6bf46e9-d93e-4754-9f48-fc598c9e1359/ovn-controller/0.log" Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.049645 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8rdg" Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.055021 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5df75dfc9b-mpgf2"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.073660 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.089924 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.117561 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.127917 5118 generic.go:334] "Generic (PLEG): container finished" podID="d08c9f04-59d9-4892-a540-5c892c604a71" containerID="0123533d83aacb1fef7bf221a008c5a2e879c0ab52103eda69db30b401c15faa" exitCode=0 Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.127993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerDied","Data":"0123533d83aacb1fef7bf221a008c5a2e879c0ab52103eda69db30b401c15faa"} Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.129962 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2e07-account-create-update-9hjkr"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.133019 5118 generic.go:334] "Generic (PLEG): container finished" podID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerID="614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47" exitCode=0 Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.133065 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" 
event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerDied","Data":"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"} Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.140070 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.149635 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.154806 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-f8rdg_c6bf46e9-d93e-4754-9f48-fc598c9e1359/ovn-controller/0.log" Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.154862 5118 generic.go:334] "Generic (PLEG): container finished" podID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerID="682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd" exitCode=137 Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.154947 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8rdg" Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.154993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg" event={"ID":"c6bf46e9-d93e-4754-9f48-fc598c9e1359","Type":"ContainerDied","Data":"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd"} Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.155040 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8rdg" event={"ID":"c6bf46e9-d93e-4754-9f48-fc598c9e1359","Type":"ContainerDied","Data":"51baf97df9c9fc1dde9276e6727d223947f6b7974e9089ff60b43dab0132becf"} Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.155064 5118 scope.go:117] "RemoveContainer" containerID="682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd" Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161174 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161243 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161352 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161413 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161672 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161801 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rst7g\" (UniqueName: \"kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.161820 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn\") pod \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\" (UID: \"c6bf46e9-d93e-4754-9f48-fc598c9e1359\") " Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.169420 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g" (OuterVolumeSpecName: "kube-api-access-rst7g") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "kube-api-access-rst7g". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.170646 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts" (OuterVolumeSpecName: "scripts") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.170762 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run" (OuterVolumeSpecName: "var-run") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.170782 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.171988 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.182152 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-868db49cd-qncjp"
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.184000 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1011-account-create-update-86wgz"
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.197613 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-skw6h"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.213772 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-skw6h"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.233546 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.235863 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.241238 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f94b-account-create-update-cvr4d"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.255665 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.260082 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.265166 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266692 5118 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266730 5118 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-run\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266740 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266753 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c6bf46e9-d93e-4754-9f48-fc598c9e1359-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266762 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rst7g\" (UniqueName: \"kubernetes.io/projected/c6bf46e9-d93e-4754-9f48-fc598c9e1359-kube-api-access-rst7g\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.266771 5118 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c6bf46e9-d93e-4754-9f48-fc598c9e1359-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.344175 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-69db8f76f-xbrx7"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.370186 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.376122 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "c6bf46e9-d93e-4754-9f48-fc598c9e1359" (UID: "c6bf46e9-d93e-4754-9f48-fc598c9e1359"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.391561 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.465184 5118 scope.go:117] "RemoveContainer" containerID="682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd"
Feb 23 07:10:09 crc kubenswrapper[5118]: E0223 07:10:09.469821 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd\": container with ID starting with 682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd not found: ID does not exist" containerID="682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd"
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.469866 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd"} err="failed to get container status \"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd\": rpc error: code = NotFound desc = could not find container \"682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd\": container with ID starting with 682468c3388924f8e2dc71e1223905cb05d4db4b77e8e3bc18cb20be5a1ad1cd not found: ID does not exist"
Feb 23 07:10:09 crc kubenswrapper[5118]: I0223 07:10:09.472878 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bf46e9-d93e-4754-9f48-fc598c9e1359-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.489955 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1011-account-create-update-86wgz"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.502569 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1011-account-create-update-86wgz"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.526212 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.542630 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-868db49cd-qncjp"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.551423 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f8rdg"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.573917 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl6m8\" (UniqueName: \"kubernetes.io/projected/ce9241f2-96be-4510-8216-34293762880a-kube-api-access-wl6m8\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.573949 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce9241f2-96be-4510-8216-34293762880a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.574804 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-f8rdg"]
Feb 23 07:10:10 crc kubenswrapper[5118]: E0223 07:10:09.724645 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac is running failed: container process not found" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:10:10 crc kubenswrapper[5118]: E0223 07:10:09.725082 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac is running failed: container process not found" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.725128 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0079e5d0-c2f8-43f7-8dad-207eaedca4d6" path="/var/lib/kubelet/pods/0079e5d0-c2f8-43f7-8dad-207eaedca4d6/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: E0223 07:10:09.725531 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac is running failed: container process not found" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 23 07:10:10 crc kubenswrapper[5118]: E0223 07:10:09.725589 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.725814 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" path="/var/lib/kubelet/pods/04ba04e0-7d62-472f-ab31-c41f926c93e7/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.730197 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" path="/var/lib/kubelet/pods/08b38d85-cf57-41a9-9779-1593300b77a3/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.731019 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178ef478-d8d3-49a5-9188-9970d3859049" path="/var/lib/kubelet/pods/178ef478-d8d3-49a5-9188-9970d3859049/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.732221 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a79f618-3555-44a5-8c52-ec9120261645" path="/var/lib/kubelet/pods/2a79f618-3555-44a5-8c52-ec9120261645/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.732874 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9e11a2-4b8f-4578-8c92-7d3e06258800" path="/var/lib/kubelet/pods/2c9e11a2-4b8f-4578-8c92-7d3e06258800/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.734759 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" path="/var/lib/kubelet/pods/334e9392-6a5f-4aa8-83d7-41e26e94dd32/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.735334 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376ff246-417d-442a-83d1-1579abd318ba" path="/var/lib/kubelet/pods/376ff246-417d-442a-83d1-1579abd318ba/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.736001 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" path="/var/lib/kubelet/pods/460a8b7a-b61f-4f56-889e-54b5c2346679/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.737009 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" path="/var/lib/kubelet/pods/52ab08e4-a114-4c99-adc6-dc05f711d8d9/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.737511 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" path="/var/lib/kubelet/pods/5e254320-082c-442b-a1a9-4b7fafe2c556/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.737967 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652f586b-2ae4-4a45-bc82-01c65ec27696" path="/var/lib/kubelet/pods/652f586b-2ae4-4a45-bc82-01c65ec27696/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.738294 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745f142b-ee2d-4354-98b3-2b6cd13e3b5e" path="/var/lib/kubelet/pods/745f142b-ee2d-4354-98b3-2b6cd13e3b5e/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.742615 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" path="/var/lib/kubelet/pods/84fdf432-1886-4e91-bd3c-bca6f1b90c3a/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.743130 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a26f9ed-63ef-4fa5-934f-e1190d79cf85" path="/var/lib/kubelet/pods/8a26f9ed-63ef-4fa5-934f-e1190d79cf85/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.743480 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" path="/var/lib/kubelet/pods/95c7c403-ece4-4778-9a1c-25dbc355a0bf/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.744428 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b555633-a53e-4689-b746-98bd29e6742e" path="/var/lib/kubelet/pods/9b555633-a53e-4689-b746-98bd29e6742e/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.744821 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" path="/var/lib/kubelet/pods/b0e825c7-deb0-41b5-b358-f23dcc0f1082/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.745368 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" path="/var/lib/kubelet/pods/b6c8ab95-d7a9-4f39-abff-bd8fd89590ed/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.745883 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf376c5f-cc04-4733-9968-e199472b4241" path="/var/lib/kubelet/pods/bf376c5f-cc04-4733-9968-e199472b4241/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.746670 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" path="/var/lib/kubelet/pods/c6bf46e9-d93e-4754-9f48-fc598c9e1359/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.747338 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" path="/var/lib/kubelet/pods/cd013d81-347d-4c1c-9ccf-0f5e1a590755/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.747738 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9241f2-96be-4510-8216-34293762880a" path="/var/lib/kubelet/pods/ce9241f2-96be-4510-8216-34293762880a/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.748003 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" path="/var/lib/kubelet/pods/cf1e2e3d-0fc3-474a-a15d-6808347c8240/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.748996 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" path="/var/lib/kubelet/pods/e66bcbb5-075a-4a87-981c-0dc608f19742/volumes"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.768388 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880038 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880708 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880743 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880806 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880828 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880856 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880927 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.880951 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlhr\" (UniqueName: \"kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr\") pod \"92a80c23-cba1-417f-bbd5-5c5138c3664a\" (UID: \"92a80c23-cba1-417f-bbd5-5c5138c3664a\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.886066 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts" (OuterVolumeSpecName: "scripts") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.886486 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.898118 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.915617 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr" (OuterVolumeSpecName: "kube-api-access-7tlhr") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "kube-api-access-7tlhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.936378 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.941300 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data" (OuterVolumeSpecName: "config-data") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.957884 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.967001 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "92a80c23-cba1-417f-bbd5-5c5138c3664a" (UID: "92a80c23-cba1-417f-bbd5-5c5138c3664a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983743 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983782 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983794 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983805 5118 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983819 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983830 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983841 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92a80c23-cba1-417f-bbd5-5c5138c3664a-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:09.983850 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlhr\" (UniqueName: \"kubernetes.io/projected/92a80c23-cba1-417f-bbd5-5c5138c3664a-kube-api-access-7tlhr\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.203500 5118 generic.go:334] "Generic (PLEG): container finished" podID="92a80c23-cba1-417f-bbd5-5c5138c3664a" containerID="7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3" exitCode=0
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.203693 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-777dc4b79-zkfms"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.204019 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777dc4b79-zkfms" event={"ID":"92a80c23-cba1-417f-bbd5-5c5138c3664a","Type":"ContainerDied","Data":"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"}
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.204087 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777dc4b79-zkfms" event={"ID":"92a80c23-cba1-417f-bbd5-5c5138c3664a","Type":"ContainerDied","Data":"591ea039ba39ce6f2d7336c52c3117c715770dcae9d1daf7a2d08d88be55b1a2"}
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.204141 5118 scope.go:117] "RemoveContainer" containerID="7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.211300 5118 generic.go:334] "Generic (PLEG): container finished" podID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerID="0a5be7ec0d548d228afd1de2a7afc107749b13909039d656bc1026e5bcf306a3" exitCode=0
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.211389 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerDied","Data":"0a5be7ec0d548d228afd1de2a7afc107749b13909039d656bc1026e5bcf306a3"}
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.225546 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbf0cfa-8a35-49c6-bfa5-6639a1e75752/ovn-northd/0.log"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.225647 5118 generic.go:334] "Generic (PLEG): container finished" podID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" exitCode=139
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.225771 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerDied","Data":"b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac"}
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.228572 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerStarted","Data":"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"}
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.281279 5118 scope.go:117] "RemoveContainer" containerID="7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"
Feb 23 07:10:10 crc kubenswrapper[5118]: E0223 07:10:10.291308 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3\": container with ID starting with 7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3 not found: ID does not exist" containerID="7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.291355 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3"} err="failed to get container status \"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3\": rpc error: code = NotFound desc = could not find container \"7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3\": container with ID starting with 7d47e2ce96e0e8bdf946ae9d1b1758a1ee156432adadf6dc3f97af13812107c3 not found: ID does not exist"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.305289 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtk98" podStartSLOduration=9.282349458 podStartE2EDuration="13.305253705s" podCreationTimestamp="2026-02-23 07:09:57 +0000 UTC" firstStartedPulling="2026-02-23 07:10:05.592209565 +0000 UTC m=+1468.595994138" lastFinishedPulling="2026-02-23 07:10:09.615113812 +0000 UTC m=+1472.618898385" observedRunningTime="2026-02-23 07:10:10.250023286 +0000 UTC m=+1473.253807869" watchObservedRunningTime="2026-02-23 07:10:10.305253705 +0000 UTC m=+1473.309038278"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.327340 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-777dc4b79-zkfms"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.336647 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-777dc4b79-zkfms"]
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.601945 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.603215 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbf0cfa-8a35-49c6-bfa5-6639a1e75752/ovn-northd/0.log"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.603296 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.605496 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.802897 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2d7\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803315 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803339 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803359 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803378 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803404 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803440 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzk42\" (UniqueName: \"kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803480 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803511 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803531 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803578 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803598 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803623 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803658 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803702 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lv4x\" (UniqueName: \"kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803739 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803760 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803777 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803808 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803823 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803841 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803867 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") "
Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803891 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle\") pod \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\" (UID: \"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752\") " Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803917 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803934 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle\") pod \"d08c9f04-59d9-4892-a540-5c892c604a71\" (UID: \"d08c9f04-59d9-4892-a540-5c892c604a71\") " Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.803965 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf\") pod \"e3b37356-5c38-40b3-af55-4f25a2f16b21\" (UID: \"e3b37356-5c38-40b3-af55-4f25a2f16b21\") " Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.805154 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.805521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.808305 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts" (OuterVolumeSpecName: "scripts") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.817125 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config" (OuterVolumeSpecName: "config") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.818175 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.821000 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x" (OuterVolumeSpecName: "kube-api-access-6lv4x") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "kube-api-access-6lv4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.821173 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info" (OuterVolumeSpecName: "pod-info") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.824527 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.831815 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42" (OuterVolumeSpecName: "kube-api-access-tzk42") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "kube-api-access-tzk42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.832514 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.834375 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.834858 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.849330 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.849450 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.857247 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.879385 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.887936 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7" (OuterVolumeSpecName: "kube-api-access-sn2d7") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "kube-api-access-sn2d7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906505 5118 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906548 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906595 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906608 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906621 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906635 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lv4x\" (UniqueName: \"kubernetes.io/projected/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-kube-api-access-6lv4x\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906647 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906658 5118 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906668 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d08c9f04-59d9-4892-a540-5c892c604a71-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906681 5118 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3b37356-5c38-40b3-af55-4f25a2f16b21-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906691 5118 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906703 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2d7\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-kube-api-access-sn2d7\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906714 5118 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3b37356-5c38-40b3-af55-4f25a2f16b21-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906727 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906737 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906750 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.906760 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzk42\" (UniqueName: \"kubernetes.io/projected/d08c9f04-59d9-4892-a540-5c892c604a71-kube-api-access-tzk42\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.929243 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.929740 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data" (OuterVolumeSpecName: "config-data") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.939675 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.972517 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.972890 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf" (OuterVolumeSpecName: "server-conf") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.972990 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.974951 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.985064 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" (UID: "8fbf0cfa-8a35-49c6-bfa5-6639a1e75752"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:10 crc kubenswrapper[5118]: I0223 07:10:10.996435 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.007417 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d08c9f04-59d9-4892-a540-5c892c604a71" (UID: "d08c9f04-59d9-4892-a540-5c892c604a71"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008276 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008318 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008330 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008339 5118 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008349 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008357 5118 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008366 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008375 5118 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3b37356-5c38-40b3-af55-4f25a2f16b21-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.008385 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d08c9f04-59d9-4892-a540-5c892c604a71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.022263 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e3b37356-5c38-40b3-af55-4f25a2f16b21" (UID: "e3b37356-5c38-40b3-af55-4f25a2f16b21"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.030364 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.108801 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.108879 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.108919 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.108937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.108978 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2wx\" (UniqueName: \"kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.109056 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.109117 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs\") pod \"cf1c2052-6563-45c5-888c-f7a153225f83\" (UID: \"cf1c2052-6563-45c5-888c-f7a153225f83\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.109430 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3b37356-5c38-40b3-af55-4f25a2f16b21-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.109441 5118 reconciler_common.go:293] "Volume 
detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.113650 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx" (OuterVolumeSpecName: "kube-api-access-5m2wx") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "kube-api-access-5m2wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.114599 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.118323 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.156878 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.199319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.205515 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config" (OuterVolumeSpecName: "config") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210360 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210463 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210560 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 
07:10:11.210618 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210645 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210683 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210774 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kq2p\" (UniqueName: \"kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.210818 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml\") pod \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\" (UID: \"a0a7f53e-845e-4dfd-a80d-f790b60270fc\") " Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.211246 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.211266 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2wx\" (UniqueName: \"kubernetes.io/projected/cf1c2052-6563-45c5-888c-f7a153225f83-kube-api-access-5m2wx\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.211279 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.211289 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.211297 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.222680 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.223785 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.227685 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts" (OuterVolumeSpecName: "scripts") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.231190 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p" (OuterVolumeSpecName: "kube-api-access-2kq2p") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "kube-api-access-2kq2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.238423 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.240149 5118 generic.go:334] "Generic (PLEG): container finished" podID="cf1c2052-6563-45c5-888c-f7a153225f83" containerID="26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61" exitCode=0 Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.240226 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerDied","Data":"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.240258 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff569978f-gwmwn" event={"ID":"cf1c2052-6563-45c5-888c-f7a153225f83","Type":"ContainerDied","Data":"5e2ad5be86c31b1d2c83c5e76adb5efa3697dd889de903b885eeee7094932341"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.240277 5118 scope.go:117] "RemoveContainer" containerID="8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.240429 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6ff569978f-gwmwn" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.244580 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8fbf0cfa-8a35-49c6-bfa5-6639a1e75752/ovn-northd/0.log" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.244705 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.245639 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8fbf0cfa-8a35-49c6-bfa5-6639a1e75752","Type":"ContainerDied","Data":"e581b26bea61e2ab21c91d0bfc1c92a522875f21a634e723c36b7d81a4b8837a"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.246269 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.247514 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d08c9f04-59d9-4892-a540-5c892c604a71","Type":"ContainerDied","Data":"ad90bffdaded534d0ab312196f1748105833f72f25de5f940c93acd8dca99b4c"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.247667 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.249305 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cf1c2052-6563-45c5-888c-f7a153225f83" (UID: "cf1c2052-6563-45c5-888c-f7a153225f83"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.252991 5118 generic.go:334] "Generic (PLEG): container finished" podID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerID="5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7" exitCode=0 Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.253065 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerDied","Data":"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.253085 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a7f53e-845e-4dfd-a80d-f790b60270fc","Type":"ContainerDied","Data":"095c3652bb6291c30e57cd77c235c928b95d49ecee6ea8d409c2537557574246"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.253171 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.256059 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.256133 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e3b37356-5c38-40b3-af55-4f25a2f16b21","Type":"ContainerDied","Data":"de6bfc3dbe9249393b3ded714f1438bc7da2bcefbb345ebbff0a39d67d7c978d"} Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.289460 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.291171 5118 scope.go:117] "RemoveContainer" containerID="26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.297776 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.303845 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314180 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314216 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314226 5118 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314236 5118 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314244 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a7f53e-845e-4dfd-a80d-f790b60270fc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314252 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1c2052-6563-45c5-888c-f7a153225f83-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314261 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kq2p\" (UniqueName: \"kubernetes.io/projected/a0a7f53e-845e-4dfd-a80d-f790b60270fc-kube-api-access-2kq2p\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.314269 5118 
reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.331818 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.340838 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.340194 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.357861 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.361176 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.363280 5118 scope.go:117] "RemoveContainer" containerID="8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.364224 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0\": container with ID starting with 8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0 not found: ID does not exist" containerID="8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.364292 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0"} err="failed to get container status \"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0\": rpc error: code = NotFound desc = could not find container \"8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0\": container with ID starting with 8ee9da8b378378a3111b821f32ed403b957cdb37ab68521f837d90389b3857c0 not found: ID does not exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.364332 5118 scope.go:117] "RemoveContainer" containerID="26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.364686 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61\": container with ID starting with 26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61 not found: ID does not exist" containerID="26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.364709 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61"} err="failed to get container status \"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61\": rpc error: code = NotFound desc = could not find container \"26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61\": container with ID starting with 26306be7ec63fc31f42436db4b904534e2e880401368f19554736448a67f2b61 not found: ID does not exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.364723 5118 scope.go:117] "RemoveContainer" containerID="25d2ae49d9a05edcb3c8c8fa2bea4dca27a3dbb07af2c6c8fc988dc353770b62" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 
07:10:11.389603 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data" (OuterVolumeSpecName: "config-data") pod "a0a7f53e-845e-4dfd-a80d-f790b60270fc" (UID: "a0a7f53e-845e-4dfd-a80d-f790b60270fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.400361 5118 scope.go:117] "RemoveContainer" containerID="b403b8eb9cd3b4b00853d9c5d499e1c40970754df960d593f68d462a9f1eb2ac" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.416063 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.416109 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a7f53e-845e-4dfd-a80d-f790b60270fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.423745 5118 scope.go:117] "RemoveContainer" containerID="0123533d83aacb1fef7bf221a008c5a2e879c0ab52103eda69db30b401c15faa" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.454594 5118 scope.go:117] "RemoveContainer" containerID="566012d8143e61ef90b99119f834d9897363370f946eb6cf21393af728dda6a1" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.481369 5118 scope.go:117] "RemoveContainer" containerID="fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.592434 5118 scope.go:117] "RemoveContainer" containerID="807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.607533 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:10:11 crc 
kubenswrapper[5118]: I0223 07:10:11.614717 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ff569978f-gwmwn"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.631237 5118 scope.go:117] "RemoveContainer" containerID="5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.631902 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.638898 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.657813 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.658536 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.658994 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.659064 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.659790 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.663422 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.665419 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.665479 5118 scope.go:117] "RemoveContainer" containerID="ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.665517 5118 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.691475 5118 scope.go:117] "RemoveContainer" containerID="fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.692533 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274\": container with ID starting with fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274 not found: ID does not exist" containerID="fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.692569 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274"} err="failed to get container status \"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274\": rpc error: code = NotFound desc = could not find container \"fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274\": container with ID starting with fc18440b95359cd6093ae34e101a5cc7ec20bc126ef2f487af8f3cd6930a4274 not found: ID does not exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.692594 5118 scope.go:117] "RemoveContainer" containerID="807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.693035 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18\": container with ID starting with 
807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18 not found: ID does not exist" containerID="807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693060 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18"} err="failed to get container status \"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18\": rpc error: code = NotFound desc = could not find container \"807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18\": container with ID starting with 807b3cca057a53c0c78b3f380c5fa5f7446fa591e1473c81ca0c4885b450da18 not found: ID does not exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693079 5118 scope.go:117] "RemoveContainer" containerID="5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.693356 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7\": container with ID starting with 5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7 not found: ID does not exist" containerID="5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693428 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7"} err="failed to get container status \"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7\": rpc error: code = NotFound desc = could not find container \"5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7\": container with ID starting with 5aaf25e2343a8bf0f2a7ee580596176abae9d55c630fa13986e0a358c1f897c7 not found: ID does not 
exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693451 5118 scope.go:117] "RemoveContainer" containerID="ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8" Feb 23 07:10:11 crc kubenswrapper[5118]: E0223 07:10:11.693707 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8\": container with ID starting with ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8 not found: ID does not exist" containerID="ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693730 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8"} err="failed to get container status \"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8\": rpc error: code = NotFound desc = could not find container \"ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8\": container with ID starting with ff78e665cdf7e9144a93f5cc04e60248707d7472c0274aa957c8c7cd552131f8 not found: ID does not exist" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.693746 5118 scope.go:117] "RemoveContainer" containerID="0a5be7ec0d548d228afd1de2a7afc107749b13909039d656bc1026e5bcf306a3" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.719448 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" path="/var/lib/kubelet/pods/8fbf0cfa-8a35-49c6-bfa5-6639a1e75752/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.720218 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a80c23-cba1-417f-bbd5-5c5138c3664a" path="/var/lib/kubelet/pods/92a80c23-cba1-417f-bbd5-5c5138c3664a/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.721257 5118 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" path="/var/lib/kubelet/pods/a0a7f53e-845e-4dfd-a80d-f790b60270fc/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.723981 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" path="/var/lib/kubelet/pods/cf1c2052-6563-45c5-888c-f7a153225f83/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.725367 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" path="/var/lib/kubelet/pods/d08c9f04-59d9-4892-a540-5c892c604a71/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.727225 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" path="/var/lib/kubelet/pods/e3b37356-5c38-40b3-af55-4f25a2f16b21/volumes" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.741898 5118 scope.go:117] "RemoveContainer" containerID="cdb183b32df15cc29470c163dcec76e49a1bcccca2adcf52c5951d4b1ff228f2" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.984629 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:11 crc kubenswrapper[5118]: I0223 07:10:11.984647 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5df75dfc9b-mpgf2" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": context deadline exceeded" Feb 23 07:10:12 crc kubenswrapper[5118]: E0223 07:10:12.440747 5118 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:10:12 crc kubenswrapper[5118]: E0223 07:10:12.440819 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:10:28.440804865 +0000 UTC m=+1491.444589438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.657422 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.658806 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.663256 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.663336 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.663392 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.667248 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.668372 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:16 crc kubenswrapper[5118]: E0223 07:10:16.668432 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:17 crc kubenswrapper[5118]: I0223 07:10:17.149414 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:17 crc kubenswrapper[5118]: I0223 07:10:17.149389 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:17 crc kubenswrapper[5118]: I0223 07:10:17.918509 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:17 crc kubenswrapper[5118]: I0223 07:10:17.918586 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:18 crc kubenswrapper[5118]: I0223 07:10:18.004032 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:18 crc kubenswrapper[5118]: I0223 07:10:18.414585 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:18 crc kubenswrapper[5118]: I0223 07:10:18.470833 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"]
Feb 23 07:10:20 crc kubenswrapper[5118]: I0223 07:10:20.377238 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtk98" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="registry-server" containerID="cri-o://47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a" gracePeriod=2
Feb 23 07:10:20 crc kubenswrapper[5118]: I0223 07:10:20.930295 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.096960 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities\") pod \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") "
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.097030 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95rh8\" (UniqueName: \"kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8\") pod \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") "
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.097090 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content\") pod \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\" (UID: \"8a98024d-a91d-4769-9ec8-5537f7d6c20f\") "
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.098590 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities" (OuterVolumeSpecName: "utilities") pod "8a98024d-a91d-4769-9ec8-5537f7d6c20f" (UID: "8a98024d-a91d-4769-9ec8-5537f7d6c20f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.105407 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8" (OuterVolumeSpecName: "kube-api-access-95rh8") pod "8a98024d-a91d-4769-9ec8-5537f7d6c20f" (UID: "8a98024d-a91d-4769-9ec8-5537f7d6c20f"). InnerVolumeSpecName "kube-api-access-95rh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.133069 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a98024d-a91d-4769-9ec8-5537f7d6c20f" (UID: "8a98024d-a91d-4769-9ec8-5537f7d6c20f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.199550 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.199589 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a98024d-a91d-4769-9ec8-5537f7d6c20f-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.199608 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95rh8\" (UniqueName: \"kubernetes.io/projected/8a98024d-a91d-4769-9ec8-5537f7d6c20f-kube-api-access-95rh8\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.395259 5118 generic.go:334] "Generic (PLEG): container finished" podID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerID="47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a" exitCode=0
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.395382 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtk98"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.396842 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerDied","Data":"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"}
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.396999 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtk98" event={"ID":"8a98024d-a91d-4769-9ec8-5537f7d6c20f","Type":"ContainerDied","Data":"76405e757f3b71bdf4e65919529ef87fb707a2c251ee9f626bb7d91a76d742ca"}
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.397040 5118 scope.go:117] "RemoveContainer" containerID="47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.459298 5118 scope.go:117] "RemoveContainer" containerID="614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.463259 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"]
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.471869 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtk98"]
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.498766 5118 scope.go:117] "RemoveContainer" containerID="7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.532322 5118 scope.go:117] "RemoveContainer" containerID="47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.533494 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a\": container with ID starting with 47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a not found: ID does not exist" containerID="47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.533560 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a"} err="failed to get container status \"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a\": rpc error: code = NotFound desc = could not find container \"47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a\": container with ID starting with 47acf5767ed76e83ce8c39c0d16c3dc6edb208ef5c3f312ffc6185749cd0e00a not found: ID does not exist"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.533605 5118 scope.go:117] "RemoveContainer" containerID="614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.534021 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47\": container with ID starting with 614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47 not found: ID does not exist" containerID="614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.534068 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47"} err="failed to get container status \"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47\": rpc error: code = NotFound desc = could not find container \"614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47\": container with ID starting with 614e572d3b2192a894666898d5011b8b74420c8cb3d3cee7ae0b5069cbf25c47 not found: ID does not exist"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.534139 5118 scope.go:117] "RemoveContainer" containerID="7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b"
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.534588 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b\": container with ID starting with 7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b not found: ID does not exist" containerID="7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.534630 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b"} err="failed to get container status \"7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b\": rpc error: code = NotFound desc = could not find container \"7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b\": container with ID starting with 7b862be5aa93aad29061bd0baa2884f089d7f746b7834490ad430fcec7ba5d1b not found: ID does not exist"
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.657972 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.664598 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.668679 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.669717 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.669782 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.671492 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.673404 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:21 crc kubenswrapper[5118]: E0223 07:10:21.673604 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:21 crc kubenswrapper[5118]: I0223 07:10:21.711530 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" path="/var/lib/kubelet/pods/8a98024d-a91d-4769-9ec8-5537f7d6c20f/volumes"
Feb 23 07:10:22 crc kubenswrapper[5118]: I0223 07:10:22.160292 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:22 crc kubenswrapper[5118]: I0223 07:10:22.160319 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.657935 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.659691 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.660442 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.660504 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.661701 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.664486 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.667364 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 23 07:10:26 crc kubenswrapper[5118]: E0223 07:10:26.667572 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-dvc2v" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:27 crc kubenswrapper[5118]: I0223 07:10:27.169421 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:27 crc kubenswrapper[5118]: I0223 07:10:27.169421 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 07:10:28 crc kubenswrapper[5118]: E0223 07:10:28.452055 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:28 crc kubenswrapper[5118]: E0223 07:10:28.452460 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:11:00.452438313 +0000 UTC m=+1523.456222896 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.491705 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dvc2v_68f11050-5931-4be3-8e5b-194035e88020/ovs-vswitchd/0.log"
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.493983 5118 generic.go:334] "Generic (PLEG): container finished" podID="68f11050-5931-4be3-8e5b-194035e88020" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" exitCode=137
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.494016 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerDied","Data":"b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d"}
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.869261 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dvc2v_68f11050-5931-4be3-8e5b-194035e88020/ovs-vswitchd/0.log"
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.869970 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dvc2v"
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.975905 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976052 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976061 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976121 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cxnx\" (UniqueName: \"kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976138 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib" (OuterVolumeSpecName: "var-lib") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976141 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976178 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log" (OuterVolumeSpecName: "var-log") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976193 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976209 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run" (OuterVolumeSpecName: "var-run") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976347 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts\") pod \"68f11050-5931-4be3-8e5b-194035e88020\" (UID: \"68f11050-5931-4be3-8e5b-194035e88020\") "
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976963 5118 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-etc-ovs\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976982 5118 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-lib\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.976992 5118 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-log\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.977000 5118 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68f11050-5931-4be3-8e5b-194035e88020-var-run\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.977620 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts" (OuterVolumeSpecName: "scripts") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 07:10:29 crc kubenswrapper[5118]: I0223 07:10:29.997516 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx" (OuterVolumeSpecName: "kube-api-access-7cxnx") pod "68f11050-5931-4be3-8e5b-194035e88020" (UID: "68f11050-5931-4be3-8e5b-194035e88020"). InnerVolumeSpecName "kube-api-access-7cxnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.078586 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68f11050-5931-4be3-8e5b-194035e88020-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.078650 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cxnx\" (UniqueName: \"kubernetes.io/projected/68f11050-5931-4be3-8e5b-194035e88020-kube-api-access-7cxnx\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.174242 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.280825 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281061 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281116 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281192 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281258 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct4ms\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281308 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock\") pod \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\" (UID: \"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da\") "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.281930 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock" (OuterVolumeSpecName: "lock") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.282478 5118 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-lock\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.282555 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache" (OuterVolumeSpecName: "cache") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.287062 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.287178 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.287661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms" (OuterVolumeSpecName: "kube-api-access-ct4ms") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "kube-api-access-ct4ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.384040 5118 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-cache\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.384072 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct4ms\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-kube-api-access-ct4ms\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.384117 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.384133 5118 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.419622 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.485935 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.519564 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerID="e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb" exitCode=137
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.519649 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb"}
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.521282 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b7e8b9fa-2985-45f6-97e1-77a56b8ba9da","Type":"ContainerDied","Data":"5a9bb9aa7eb6e2635b6815a29d567a3e06c60854cd41b883f534e669bdc8ace2"}
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.521361 5118 scope.go:117] "RemoveContainer" containerID="e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb"
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.519723 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.522217 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dvc2v_68f11050-5931-4be3-8e5b-194035e88020/ovs-vswitchd/0.log"
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.522821 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dvc2v" event={"ID":"68f11050-5931-4be3-8e5b-194035e88020","Type":"ContainerDied","Data":"9301e9f414bcb2a4c27bd7223faf1d0cca6f8bed94f16cee9612a8fec1e089b3"}
Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.522864 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dvc2v" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.544124 5118 scope.go:117] "RemoveContainer" containerID="6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.562200 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.563451 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-dvc2v"] Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.577894 5118 scope.go:117] "RemoveContainer" containerID="a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.597991 5118 scope.go:117] "RemoveContainer" containerID="ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.618802 5118 scope.go:117] "RemoveContainer" containerID="e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.642322 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" (UID: "b7e8b9fa-2985-45f6-97e1-77a56b8ba9da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.644901 5118 scope.go:117] "RemoveContainer" containerID="254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.676707 5118 scope.go:117] "RemoveContainer" containerID="0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.689843 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.699979 5118 scope.go:117] "RemoveContainer" containerID="5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.722546 5118 scope.go:117] "RemoveContainer" containerID="9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.752065 5118 scope.go:117] "RemoveContainer" containerID="8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.776603 5118 scope.go:117] "RemoveContainer" containerID="5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.806286 5118 scope.go:117] "RemoveContainer" containerID="8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.833196 5118 scope.go:117] "RemoveContainer" containerID="3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.869432 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.874287 5118 scope.go:117] 
"RemoveContainer" containerID="1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.879658 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.904257 5118 scope.go:117] "RemoveContainer" containerID="b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.931692 5118 scope.go:117] "RemoveContainer" containerID="e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.932486 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb\": container with ID starting with e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb not found: ID does not exist" containerID="e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.932526 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb"} err="failed to get container status \"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb\": rpc error: code = NotFound desc = could not find container \"e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb\": container with ID starting with e860b0afed7efad15425811e3754d6e6ac170fcedeee83776c24e99b8b1fe8cb not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.932551 5118 scope.go:117] "RemoveContainer" containerID="6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.933197 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54\": container with ID starting with 6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54 not found: ID does not exist" containerID="6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.933240 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54"} err="failed to get container status \"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54\": rpc error: code = NotFound desc = could not find container \"6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54\": container with ID starting with 6f7d74ea8f01756286d1e26cf2f117e84d31454f93e740c68ecbf2cf62586f54 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.933270 5118 scope.go:117] "RemoveContainer" containerID="a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.933693 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8\": container with ID starting with a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8 not found: ID does not exist" containerID="a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.933745 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8"} err="failed to get container status \"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8\": rpc error: code = NotFound desc = could not find container 
\"a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8\": container with ID starting with a0ba4cdc1ac08949a93ad7c7e4031797c918988df33ac9d19c4b7e7f3a29aac8 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.933773 5118 scope.go:117] "RemoveContainer" containerID="ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.934158 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015\": container with ID starting with ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015 not found: ID does not exist" containerID="ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.934180 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015"} err="failed to get container status \"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015\": rpc error: code = NotFound desc = could not find container \"ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015\": container with ID starting with ebcd677f579efd2c9a7103929eb8183430e5e7cd050bf545abfa84a9b7a7a015 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.934194 5118 scope.go:117] "RemoveContainer" containerID="e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.934662 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c\": container with ID starting with e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c not found: ID does not exist" 
containerID="e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.934689 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c"} err="failed to get container status \"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c\": rpc error: code = NotFound desc = could not find container \"e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c\": container with ID starting with e33fc319b3727220b83ff55db0adbedc1bc222308242f9fff28de42e6e7f202c not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.934706 5118 scope.go:117] "RemoveContainer" containerID="254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.935078 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c\": container with ID starting with 254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c not found: ID does not exist" containerID="254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935112 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c"} err="failed to get container status \"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c\": rpc error: code = NotFound desc = could not find container \"254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c\": container with ID starting with 254c9b1beceaf094b604810eda6a1266d5a885483d11e7ef8b8c4c1cd05eb20c not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935124 5118 scope.go:117] 
"RemoveContainer" containerID="0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.935454 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64\": container with ID starting with 0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64 not found: ID does not exist" containerID="0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935472 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64"} err="failed to get container status \"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64\": rpc error: code = NotFound desc = could not find container \"0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64\": container with ID starting with 0abcfbc1c1bffb02dad599898f639ae8c2dd1490e7a46d5834a348825facaf64 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935483 5118 scope.go:117] "RemoveContainer" containerID="5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.935733 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9\": container with ID starting with 5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9 not found: ID does not exist" containerID="5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935759 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9"} err="failed to get container status \"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9\": rpc error: code = NotFound desc = could not find container \"5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9\": container with ID starting with 5cf07cf3267a1cc318eaf13b03fa00941dcbfe762414fd6ce7c42fb7a7c357b9 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.935774 5118 scope.go:117] "RemoveContainer" containerID="9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.936132 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f\": container with ID starting with 9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f not found: ID does not exist" containerID="9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936153 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f"} err="failed to get container status \"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f\": rpc error: code = NotFound desc = could not find container \"9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f\": container with ID starting with 9d42ace31706e3b183f91d987e04fcf0f999fe746a2976f3d39e7f2ff2a4fa1f not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936167 5118 scope.go:117] "RemoveContainer" containerID="8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.936564 5118 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553\": container with ID starting with 8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553 not found: ID does not exist" containerID="8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936599 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553"} err="failed to get container status \"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553\": rpc error: code = NotFound desc = could not find container \"8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553\": container with ID starting with 8a00b3605881434d4ba83a253687df6941c89b899c35d7faa354ccf501707553 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936613 5118 scope.go:117] "RemoveContainer" containerID="5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.936930 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed\": container with ID starting with 5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed not found: ID does not exist" containerID="5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936954 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed"} err="failed to get container status \"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed\": rpc error: code = NotFound desc = could not find container 
\"5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed\": container with ID starting with 5cd094c7460e2ae0b8162f76fb848d220abb67678269f0fccc6a0636ac0849ed not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.936982 5118 scope.go:117] "RemoveContainer" containerID="8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.937248 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a\": container with ID starting with 8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a not found: ID does not exist" containerID="8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.937269 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a"} err="failed to get container status \"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a\": rpc error: code = NotFound desc = could not find container \"8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a\": container with ID starting with 8f9a7274a3ab85586de9a8de787c804cf92a943b6a0413b53b39a56a124b7b8a not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.937282 5118 scope.go:117] "RemoveContainer" containerID="3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.937597 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865\": container with ID starting with 3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865 not found: ID does not exist" 
containerID="3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.937639 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865"} err="failed to get container status \"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865\": rpc error: code = NotFound desc = could not find container \"3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865\": container with ID starting with 3c570d407f9c6cf76914a63d51fb5656c583d5b76eb7c3755e83284444dee865 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.937666 5118 scope.go:117] "RemoveContainer" containerID="1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.937984 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b\": container with ID starting with 1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b not found: ID does not exist" containerID="1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.938008 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b"} err="failed to get container status \"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b\": rpc error: code = NotFound desc = could not find container \"1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b\": container with ID starting with 1e0e49dde08d1b8b3995b54ba63e3f5ffb5f41cb89104dc37a7213e4fb88f80b not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.938025 5118 scope.go:117] 
"RemoveContainer" containerID="b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56" Feb 23 07:10:30 crc kubenswrapper[5118]: E0223 07:10:30.938425 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56\": container with ID starting with b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56 not found: ID does not exist" containerID="b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.938444 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56"} err="failed to get container status \"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56\": rpc error: code = NotFound desc = could not find container \"b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56\": container with ID starting with b51c1c3daa56f5daf4547b34e43e9e31490a501760f98fb4a9cfe9c64a42eb56 not found: ID does not exist" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.938465 5118 scope.go:117] "RemoveContainer" containerID="b30210e9e7fa4d05132287a2db595add1e67ff76c72b3914f970d62835fcc88d" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.963713 5118 scope.go:117] "RemoveContainer" containerID="b75a1e1331a1773fc5f74f04ae526d4f9fcdc683606957de297b1a65a31c8cf5" Feb 23 07:10:30 crc kubenswrapper[5118]: I0223 07:10:30.987946 5118 scope.go:117] "RemoveContainer" containerID="8fadd3a879ca766fdf860608b51130b44f68888641fff695fc8c3f558ac8fed4" Feb 23 07:10:31 crc kubenswrapper[5118]: I0223 07:10:31.707593 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f11050-5931-4be3-8e5b-194035e88020" path="/var/lib/kubelet/pods/68f11050-5931-4be3-8e5b-194035e88020/volumes" Feb 23 07:10:31 crc kubenswrapper[5118]: I0223 
07:10:31.708716 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" path="/var/lib/kubelet/pods/b7e8b9fa-2985-45f6-97e1-77a56b8ba9da/volumes" Feb 23 07:10:32 crc kubenswrapper[5118]: I0223 07:10:32.178400 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:32 crc kubenswrapper[5118]: I0223 07:10:32.178389 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 07:10:32 crc kubenswrapper[5118]: I0223 07:10:32.975911 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:10:32 crc kubenswrapper[5118]: I0223 07:10:32.976008 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.056895 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.194680 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs\") pod \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.194877 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data\") pod \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.194922 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom\") pod \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.195063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle\") pod \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.195245 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fczfn\" (UniqueName: \"kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn\") pod \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\" (UID: \"fa65e5e6-1e90-407a-a462-c8ef3e406df3\") " Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.196170 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs" (OuterVolumeSpecName: "logs") pod "fa65e5e6-1e90-407a-a462-c8ef3e406df3" (UID: "fa65e5e6-1e90-407a-a462-c8ef3e406df3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.214429 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fa65e5e6-1e90-407a-a462-c8ef3e406df3" (UID: "fa65e5e6-1e90-407a-a462-c8ef3e406df3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.214506 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn" (OuterVolumeSpecName: "kube-api-access-fczfn") pod "fa65e5e6-1e90-407a-a462-c8ef3e406df3" (UID: "fa65e5e6-1e90-407a-a462-c8ef3e406df3"). InnerVolumeSpecName "kube-api-access-fczfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.237354 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa65e5e6-1e90-407a-a462-c8ef3e406df3" (UID: "fa65e5e6-1e90-407a-a462-c8ef3e406df3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.280479 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data" (OuterVolumeSpecName: "config-data") pod "fa65e5e6-1e90-407a-a462-c8ef3e406df3" (UID: "fa65e5e6-1e90-407a-a462-c8ef3e406df3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.297544 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa65e5e6-1e90-407a-a462-c8ef3e406df3-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.297579 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.297593 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.297607 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa65e5e6-1e90-407a-a462-c8ef3e406df3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.297618 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fczfn\" (UniqueName: \"kubernetes.io/projected/fa65e5e6-1e90-407a-a462-c8ef3e406df3-kube-api-access-fczfn\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.599740 5118 generic.go:334] "Generic (PLEG): container finished" podID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerID="ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd" exitCode=137 Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.599815 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" 
event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerDied","Data":"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd"} Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.599861 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" event={"ID":"fa65e5e6-1e90-407a-a462-c8ef3e406df3","Type":"ContainerDied","Data":"2c591f552ee5660b09ac0ca89d5f086d3a3684d444411ab1b547f0a9ceef6efb"} Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.599858 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-587b7cf474-nwcvp" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.599892 5118 scope.go:117] "RemoveContainer" containerID="ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.647603 5118 scope.go:117] "RemoveContainer" containerID="3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.655906 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"] Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.673158 5118 scope.go:117] "RemoveContainer" containerID="ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.673574 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-587b7cf474-nwcvp"] Feb 23 07:10:36 crc kubenswrapper[5118]: E0223 07:10:36.673821 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd\": container with ID starting with ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd not found: ID does not exist" 
containerID="ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.673859 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd"} err="failed to get container status \"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd\": rpc error: code = NotFound desc = could not find container \"ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd\": container with ID starting with ae5a11a26343364f08830e554de206f0e0cae0a2920002db6095c005b0b286dd not found: ID does not exist" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.673884 5118 scope.go:117] "RemoveContainer" containerID="3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297" Feb 23 07:10:36 crc kubenswrapper[5118]: E0223 07:10:36.674249 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297\": container with ID starting with 3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297 not found: ID does not exist" containerID="3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.674279 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297"} err="failed to get container status \"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297\": rpc error: code = NotFound desc = could not find container \"3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297\": container with ID starting with 3655622ec94703aabcf0c1772ccd31e18b07af8b4d88dce6d46e88f5da36d297 not found: ID does not exist" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.956440 5118 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": EOF" Feb 23 07:10:36 crc kubenswrapper[5118]: I0223 07:10:36.956886 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": EOF" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.128309 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": dial tcp 10.217.0.209:9311: connect: connection refused" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.128433 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8c77bbddd-85rm4" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": dial tcp 10.217.0.209:9311: connect: connection refused" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.206959 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.322280 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom\") pod \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.322378 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b42tc\" (UniqueName: \"kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc\") pod \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.322480 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data\") pod \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.322535 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs\") pod \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.322565 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle\") pod \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\" (UID: \"5cc63cb3-2efc-441f-bd3c-8a6af30b9524\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.323055 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs" (OuterVolumeSpecName: "logs") pod "5cc63cb3-2efc-441f-bd3c-8a6af30b9524" (UID: "5cc63cb3-2efc-441f-bd3c-8a6af30b9524"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.325973 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5cc63cb3-2efc-441f-bd3c-8a6af30b9524" (UID: "5cc63cb3-2efc-441f-bd3c-8a6af30b9524"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.325990 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc" (OuterVolumeSpecName: "kube-api-access-b42tc") pod "5cc63cb3-2efc-441f-bd3c-8a6af30b9524" (UID: "5cc63cb3-2efc-441f-bd3c-8a6af30b9524"). InnerVolumeSpecName "kube-api-access-b42tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.345187 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cc63cb3-2efc-441f-bd3c-8a6af30b9524" (UID: "5cc63cb3-2efc-441f-bd3c-8a6af30b9524"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.347418 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8c77bbddd-85rm4" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.372384 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data" (OuterVolumeSpecName: "config-data") pod "5cc63cb3-2efc-441f-bd3c-8a6af30b9524" (UID: "5cc63cb3-2efc-441f-bd3c-8a6af30b9524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.424893 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.424935 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.424946 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.424957 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.424968 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b42tc\" (UniqueName: \"kubernetes.io/projected/5cc63cb3-2efc-441f-bd3c-8a6af30b9524-kube-api-access-b42tc\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526128 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526265 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526302 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl94x\" (UniqueName: \"kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526342 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526442 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.526484 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc 
kubenswrapper[5118]: I0223 07:10:37.526507 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle\") pod \"5f515e00-c6e0-4849-b073-64721780e216\" (UID: \"5f515e00-c6e0-4849-b073-64721780e216\") " Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.538539 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs" (OuterVolumeSpecName: "logs") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.547364 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x" (OuterVolumeSpecName: "kube-api-access-hl94x") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "kube-api-access-hl94x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.570277 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.593308 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.617391 5118 generic.go:334] "Generic (PLEG): container finished" podID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerID="c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4" exitCode=137 Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.617493 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerDied","Data":"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4"} Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.617522 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75595465d9-2pkqt" event={"ID":"5cc63cb3-2efc-441f-bd3c-8a6af30b9524","Type":"ContainerDied","Data":"2c6e64facf64b99522efa81e478fdc6768fd5a33b27ca954b45ee12797cc600c"} Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.617542 5118 scope.go:117] "RemoveContainer" containerID="c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.617669 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-75595465d9-2pkqt" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.632625 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.632662 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f515e00-c6e0-4849-b073-64721780e216-logs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.632673 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl94x\" (UniqueName: \"kubernetes.io/projected/5f515e00-c6e0-4849-b073-64721780e216-kube-api-access-hl94x\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.632683 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.647546 5118 generic.go:334] "Generic (PLEG): container finished" podID="5f515e00-c6e0-4849-b073-64721780e216" containerID="f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba" exitCode=137 Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.647596 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerDied","Data":"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba"} Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.647632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8c77bbddd-85rm4" 
event={"ID":"5f515e00-c6e0-4849-b073-64721780e216","Type":"ContainerDied","Data":"4f65c7492b670e7aacb6eaa4d4d0afc013484e1f73eb58a4335ca6edeb4ee964"} Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.647703 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8c77bbddd-85rm4" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.652738 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.653310 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.654345 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data" (OuterVolumeSpecName: "config-data") pod "5f515e00-c6e0-4849-b073-64721780e216" (UID: "5f515e00-c6e0-4849-b073-64721780e216"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.664349 5118 scope.go:117] "RemoveContainer" containerID="1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.689604 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"] Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.712289 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" path="/var/lib/kubelet/pods/fa65e5e6-1e90-407a-a462-c8ef3e406df3/volumes" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.713692 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-75595465d9-2pkqt"] Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.724582 5118 scope.go:117] "RemoveContainer" containerID="c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4" Feb 23 07:10:37 crc kubenswrapper[5118]: E0223 07:10:37.725873 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4\": container with ID starting with c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4 not found: ID does not exist" containerID="c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.725918 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4"} err="failed to get container status \"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4\": rpc error: code = NotFound desc = could not find container \"c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4\": container with ID starting with 
c8cfacd1af4b64653a6a73b15d376ffde82de15f73ed1f1ca28911e4dac172b4 not found: ID does not exist" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.725945 5118 scope.go:117] "RemoveContainer" containerID="1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec" Feb 23 07:10:37 crc kubenswrapper[5118]: E0223 07:10:37.726401 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec\": container with ID starting with 1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec not found: ID does not exist" containerID="1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.726438 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec"} err="failed to get container status \"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec\": rpc error: code = NotFound desc = could not find container \"1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec\": container with ID starting with 1cd298bcd25c58aa9ad3432a05ba3394395a6bc4ab199aba99594ae694dcb7ec not found: ID does not exist" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.726463 5118 scope.go:117] "RemoveContainer" containerID="f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.734248 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.734270 5118 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.734280 5118 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f515e00-c6e0-4849-b073-64721780e216-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.750075 5118 scope.go:117] "RemoveContainer" containerID="a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.766926 5118 scope.go:117] "RemoveContainer" containerID="f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba" Feb 23 07:10:37 crc kubenswrapper[5118]: E0223 07:10:37.768577 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba\": container with ID starting with f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba not found: ID does not exist" containerID="f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.768620 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba"} err="failed to get container status \"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba\": rpc error: code = NotFound desc = could not find container \"f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba\": container with ID starting with f832d79782b8f36f973db00c4ed3b76d6492487fbe1e454ff72f42688446e4ba not found: ID does not exist" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.768648 5118 scope.go:117] "RemoveContainer" containerID="a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f" Feb 23 07:10:37 
crc kubenswrapper[5118]: E0223 07:10:37.772316 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f\": container with ID starting with a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f not found: ID does not exist" containerID="a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.772335 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f"} err="failed to get container status \"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f\": rpc error: code = NotFound desc = could not find container \"a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f\": container with ID starting with a8a9a12c4d8fed7fd906c17eba3bb10f389a5518ab0f0c1da2de0fd8218d5f5f not found: ID does not exist" Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.972187 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"] Feb 23 07:10:37 crc kubenswrapper[5118]: I0223 07:10:37.977706 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8c77bbddd-85rm4"] Feb 23 07:10:39 crc kubenswrapper[5118]: I0223 07:10:39.709312 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" path="/var/lib/kubelet/pods/5cc63cb3-2efc-441f-bd3c-8a6af30b9524/volumes" Feb 23 07:10:39 crc kubenswrapper[5118]: I0223 07:10:39.710742 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f515e00-c6e0-4849-b073-64721780e216" path="/var/lib/kubelet/pods/5f515e00-c6e0-4849-b073-64721780e216/volumes" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.582756 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9n2g9"]
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583775 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-expirer"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583798 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-expirer"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583817 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583831 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583861 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583879 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583906 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583920 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583943 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-reaper"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583955 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-reaper"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.583980 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="swift-recon-cron"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.583993 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="swift-recon-cron"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584017 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584030 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584052 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="sg-core"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584065 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="sg-core"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584083 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584103 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584146 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="openstack-network-exporter"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584159 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="openstack-network-exporter"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584183 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584196 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584212 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="setup-container"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584225 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="setup-container"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584259 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="rabbitmq"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584272 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="rabbitmq"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584288 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584316 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584332 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584346 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584360 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584373 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584389 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584401 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584423 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584436 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584455 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" containerName="kube-state-metrics"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584468 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" containerName="kube-state-metrics"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584482 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="extract-utilities"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584495 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="extract-utilities"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584520 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584533 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584552 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584564 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584584 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584597 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584616 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584629 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584655 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584667 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584688 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="rsync"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584701 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="rsync"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584721 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584734 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584761 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584774 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584791 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584805 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584822 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584835 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584849 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584862 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584880 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584895 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584918 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584930 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584950 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="mysql-bootstrap"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584965 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="mysql-bootstrap"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.584981 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server-init"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.584993 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server-init"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585009 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a80c23-cba1-417f-bbd5-5c5138c3664a" containerName="keystone-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585023 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a80c23-cba1-417f-bbd5-5c5138c3664a" containerName="keystone-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585047 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585060 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585079 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585092 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585209 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585223 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585242 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585256 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585277 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585292 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585308 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerName="nova-scheduler-scheduler"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585321 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerName="nova-scheduler-scheduler"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585345 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585359 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585380 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585393 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585409 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585422 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585436 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-notification-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585449 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-notification-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585470 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585483 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585502 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-central-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585515 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-central-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585536 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585548 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585564 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585578 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585599 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="proxy-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585612 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="proxy-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585634 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="extract-content"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585648 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="extract-content"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585665 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585681 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585696 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585708 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585726 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerName="nova-cell1-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585740 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerName="nova-cell1-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585763 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585778 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585797 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="registry-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585811 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="registry-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585832 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="mysql-bootstrap"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585845 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="mysql-bootstrap"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585866 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585880 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585900 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585913 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585930 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerName="nova-cell0-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585943 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerName="nova-cell0-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585967 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.585979 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.585994 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586007 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.586029 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" containerName="memcached"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586042 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" containerName="memcached"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.586060 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586073 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.586127 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586140 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: E0223 07:10:59.586153 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586166 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586410 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e2e3d-0fc3-474a-a15d-6808347c8240" containerName="nova-scheduler-scheduler"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586439 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586465 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586480 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586498 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a98024d-a91d-4769-9ec8-5537f7d6c20f" containerName="registry-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586512 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586528 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586547 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="openstack-network-exporter"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586566 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b37356-5c38-40b3-af55-4f25a2f16b21" containerName="rabbitmq"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586585 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c8ab95-d7a9-4f39-abff-bd8fd89590ed" containerName="kube-state-metrics"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586607 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586623 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586645 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586664 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586683 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c7c403-ece4-4778-9a1c-25dbc355a0bf" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586696 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b38d85-cf57-41a9-9779-1593300b77a3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586713 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586739 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovsdb-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586755 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586771 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="swift-recon-cron"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586785 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586804 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586827 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586848 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-auditor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586863 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586885 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="proxy-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586901 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="sg-core"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586923 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586938 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-notification-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586956 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84fdf432-1886-4e91-bd3c-bca6f1b90c3a" containerName="nova-metadata-metadata"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586974 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.586993 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1c2052-6563-45c5-888c-f7a153225f83" containerName="neutron-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587009 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bf46e9-d93e-4754-9f48-fc598c9e1359" containerName="ovn-controller"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587028 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="container-updater"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587044 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f11050-5931-4be3-8e5b-194035e88020" containerName="ovs-vswitchd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587060 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ab08e4-a114-4c99-adc6-dc05f711d8d9" containerName="memcached"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587079 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a79f618-3555-44a5-8c52-ec9120261645" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587128 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc63cb3-2efc-441f-bd3c-8a6af30b9524" containerName="barbican-worker"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587152 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="rsync"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587173 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd013d81-347d-4c1c-9ccf-0f5e1a590755" containerName="nova-api-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587193 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="178ef478-d8d3-49a5-9188-9970d3859049" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587210 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="460a8b7a-b61f-4f56-889e-54b5c2346679" containerName="nova-cell1-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587225 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbf0cfa-8a35-49c6-bfa5-6639a1e75752" containerName="ovn-northd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587241 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-reaper"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587260 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-server"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587273 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a80c23-cba1-417f-bbd5-5c5138c3664a" containerName="keystone-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587291 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587311 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a7f53e-845e-4dfd-a80d-f790b60270fc" containerName="ceilometer-central-agent"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587332 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587354 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587374 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e825c7-deb0-41b5-b358-f23dcc0f1082" containerName="placement-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587389 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ba04e0-7d62-472f-ab31-c41f926c93e7" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587408 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08c9f04-59d9-4892-a540-5c892c604a71" containerName="galera"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587427 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f515e00-c6e0-4849-b073-64721780e216" containerName="barbican-api"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587441 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="376ff246-417d-442a-83d1-1579abd318ba" containerName="glance-httpd"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587456 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66bcbb5-075a-4a87-981c-0dc608f19742" containerName="barbican-worker-log"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587470 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="object-expirer"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587490 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa65e5e6-1e90-407a-a462-c8ef3e406df3" containerName="barbican-keystone-listener"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587509 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8b9fa-2985-45f6-97e1-77a56b8ba9da" containerName="account-replicator"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.587523 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="334e9392-6a5f-4aa8-83d7-41e26e94dd32" containerName="nova-cell0-conductor-conductor"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.588022 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e254320-082c-442b-a1a9-4b7fafe2c556" containerName="mariadb-account-create-update"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.589426 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n2g9"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.604751 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n2g9"]
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.748255 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.748307 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqndl\" (UniqueName: \"kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.748371 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.850008 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9"
Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.850425 5118 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-wqndl\" (UniqueName: \"kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.850731 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.851654 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.852815 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.871352 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqndl\" (UniqueName: \"kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl\") pod \"community-operators-9n2g9\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:10:59 crc kubenswrapper[5118]: I0223 07:10:59.954486 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:00 crc kubenswrapper[5118]: I0223 07:11:00.430120 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n2g9"] Feb 23 07:11:00 crc kubenswrapper[5118]: E0223 07:11:00.464916 5118 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:00 crc kubenswrapper[5118]: E0223 07:11:00.465002 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data podName:5721793b-d753-4519-b484-fa9cb958def9 nodeName:}" failed. No retries permitted until 2026-02-23 07:12:04.464982463 +0000 UTC m=+1587.468767036 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data") pod "rabbitmq-cell1-server-0" (UID: "5721793b-d753-4519-b484-fa9cb958def9") : configmap "rabbitmq-cell1-config-data" not found Feb 23 07:11:00 crc kubenswrapper[5118]: I0223 07:11:00.966467 5118 generic.go:334] "Generic (PLEG): container finished" podID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerID="aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4" exitCode=0 Feb 23 07:11:00 crc kubenswrapper[5118]: I0223 07:11:00.966549 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerDied","Data":"aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4"} Feb 23 07:11:00 crc kubenswrapper[5118]: I0223 07:11:00.966630 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerStarted","Data":"27ce4599cd4bae881a8fbc5c3f5cfe8107227493cdbe5fd304500b5a948c6e5a"} 
Feb 23 07:11:01 crc kubenswrapper[5118]: I0223 07:11:01.978399 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerStarted","Data":"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2"} Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.975037 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.975178 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.975245 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.976308 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.976780 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" 
containerName="machine-config-daemon" containerID="cri-o://4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab" gracePeriod=600 Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.993834 5118 generic.go:334] "Generic (PLEG): container finished" podID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerID="8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2" exitCode=0 Feb 23 07:11:02 crc kubenswrapper[5118]: I0223 07:11:02.993896 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerDied","Data":"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2"} Feb 23 07:11:03 crc kubenswrapper[5118]: E0223 07:11:03.488715 5118 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 23 07:11:03 crc kubenswrapper[5118]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-521-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Error: unable to perform an operation on node 
'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: 
Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-273-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack/rabbitmq-cell1-server-0" message=< Feb 23 07:11:03 crc kubenswrapper[5118]: Will put node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack into maintenance mode. The node will no longer serve any client traffic! Feb 23 07:11:03 crc kubenswrapper[5118]: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-521-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Error: unable to perform an operation on node 
'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: 
Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-273-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: > Feb 23 07:11:03 crc kubenswrapper[5118]: E0223 07:11:03.488776 5118 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 23 07:11:03 crc kubenswrapper[5118]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. 
due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-521-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Error: unable to perform an operation on node 
'rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'. Please see diagnostics information and suggestions below. Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Most common reasons for this are: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues) Feb 23 07:11:03 crc kubenswrapper[5118]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server) Feb 23 07:11:03 crc kubenswrapper[5118]: * Target node is not running Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: In addition to the diagnostics info below: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more Feb 23 07:11:03 crc kubenswrapper[5118]: * Consult server logs on node rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack Feb 23 07:11:03 crc kubenswrapper[5118]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: DIAGNOSTICS Feb 23 07:11:03 crc kubenswrapper[5118]: =========== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: attempted to contact: ['rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack'] Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: Feb 23 07:11:03 crc kubenswrapper[5118]: * unable to connect to epmd (port 4369) on rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack: nxdomain (non-existing domain) Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: 
Current node details: Feb 23 07:11:03 crc kubenswrapper[5118]: * node name: 'rabbitmqcli-273-rabbit@rabbitmq-cell1-server-0.rabbitmq-cell1-nodes.openstack' Feb 23 07:11:03 crc kubenswrapper[5118]: * effective user's home directory: /var/lib/rabbitmq Feb 23 07:11:03 crc kubenswrapper[5118]: * Erlang cookie hash: Zk9Apke+rQyI4khtSw4P/Q== Feb 23 07:11:03 crc kubenswrapper[5118]: Feb 23 07:11:03 crc kubenswrapper[5118]: > pod="openstack/rabbitmq-cell1-server-0" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="rabbitmq" containerID="cri-o://23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7" Feb 23 07:11:03 crc kubenswrapper[5118]: I0223 07:11:03.488831 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="rabbitmq" containerID="cri-o://23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7" gracePeriod=604738 Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.008947 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab" exitCode=0 Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.009000 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab"} Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.009545 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"} Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.009584 5118 scope.go:117] 
"RemoveContainer" containerID="497192e0fff21b8738796a3ac22d605d2776126154853b7db5acd78788b414a6" Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.014827 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerStarted","Data":"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6"} Feb 23 07:11:04 crc kubenswrapper[5118]: I0223 07:11:04.061049 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9n2g9" podStartSLOduration=2.635642911 podStartE2EDuration="5.061022639s" podCreationTimestamp="2026-02-23 07:10:59 +0000 UTC" firstStartedPulling="2026-02-23 07:11:00.969476889 +0000 UTC m=+1523.973261492" lastFinishedPulling="2026-02-23 07:11:03.394856607 +0000 UTC m=+1526.398641220" observedRunningTime="2026-02-23 07:11:04.059683096 +0000 UTC m=+1527.063467709" watchObservedRunningTime="2026-02-23 07:11:04.061022639 +0000 UTC m=+1527.064807242" Feb 23 07:11:09 crc kubenswrapper[5118]: I0223 07:11:09.955674 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:09 crc kubenswrapper[5118]: I0223 07:11:09.956428 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.040937 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.101066 5118 generic.go:334] "Generic (PLEG): container finished" podID="5721793b-d753-4519-b484-fa9cb958def9" containerID="23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7" exitCode=0 Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.101161 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerDied","Data":"23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7"} Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.101220 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5721793b-d753-4519-b484-fa9cb958def9","Type":"ContainerDied","Data":"98d02a05363a9f33a34e46dba064e4d303fb9272026a6b4a3347a78c263c4417"} Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.101243 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d02a05363a9f33a34e46dba064e4d303fb9272026a6b4a3347a78c263c4417" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.126932 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.140063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.140238 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh82w\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.140346 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 
07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.140379 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.140517 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141299 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141384 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141431 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141492 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf" (OuterVolumeSpecName: 
"plugins-conf") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141779 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141514 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.141940 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.142043 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls\") pod \"5721793b-d753-4519-b484-fa9cb958def9\" (UID: \"5721793b-d753-4519-b484-fa9cb958def9\") " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.142622 5118 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 
07:11:10.142675 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.142840 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.148244 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.148604 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.149461 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.149651 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w" (OuterVolumeSpecName: "kube-api-access-fh82w") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "kube-api-access-fh82w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.150191 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info" (OuterVolumeSpecName: "pod-info") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.155363 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.182808 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data" (OuterVolumeSpecName: "config-data") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.228618 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf" (OuterVolumeSpecName: "server-conf") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.238829 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5721793b-d753-4519-b484-fa9cb958def9" (UID: "5721793b-d753-4519-b484-fa9cb958def9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244374 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244479 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244537 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244589 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh82w\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-kube-api-access-fh82w\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244652 5118 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5721793b-d753-4519-b484-fa9cb958def9-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244704 5118 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/5721793b-d753-4519-b484-fa9cb958def9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244759 5118 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5721793b-d753-4519-b484-fa9cb958def9-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244855 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.244939 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5721793b-d753-4519-b484-fa9cb958def9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.258668 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.285288 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n2g9"] Feb 23 07:11:10 crc kubenswrapper[5118]: I0223 07:11:10.345836 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:11 crc kubenswrapper[5118]: I0223 07:11:11.111850 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 07:11:11 crc kubenswrapper[5118]: I0223 07:11:11.165205 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:11 crc kubenswrapper[5118]: I0223 07:11:11.167825 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 07:11:11 crc kubenswrapper[5118]: I0223 07:11:11.713943 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5721793b-d753-4519-b484-fa9cb958def9" path="/var/lib/kubelet/pods/5721793b-d753-4519-b484-fa9cb958def9/volumes" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.120663 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9n2g9" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="registry-server" containerID="cri-o://bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6" gracePeriod=2 Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.602752 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.788516 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqndl\" (UniqueName: \"kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl\") pod \"55a3d955-0f90-4f59-8282-e108b819b5fe\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.788643 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities\") pod \"55a3d955-0f90-4f59-8282-e108b819b5fe\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.788830 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content\") pod \"55a3d955-0f90-4f59-8282-e108b819b5fe\" (UID: \"55a3d955-0f90-4f59-8282-e108b819b5fe\") " Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.790390 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities" (OuterVolumeSpecName: "utilities") pod "55a3d955-0f90-4f59-8282-e108b819b5fe" (UID: "55a3d955-0f90-4f59-8282-e108b819b5fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.791373 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.795816 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl" (OuterVolumeSpecName: "kube-api-access-wqndl") pod "55a3d955-0f90-4f59-8282-e108b819b5fe" (UID: "55a3d955-0f90-4f59-8282-e108b819b5fe"). InnerVolumeSpecName "kube-api-access-wqndl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.873670 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a3d955-0f90-4f59-8282-e108b819b5fe" (UID: "55a3d955-0f90-4f59-8282-e108b819b5fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.892816 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a3d955-0f90-4f59-8282-e108b819b5fe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:12 crc kubenswrapper[5118]: I0223 07:11:12.893355 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqndl\" (UniqueName: \"kubernetes.io/projected/55a3d955-0f90-4f59-8282-e108b819b5fe-kube-api-access-wqndl\") on node \"crc\" DevicePath \"\"" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.137644 5118 generic.go:334] "Generic (PLEG): container finished" podID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerID="bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6" exitCode=0 Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.137767 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerDied","Data":"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6"} Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.137797 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n2g9" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.137815 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n2g9" event={"ID":"55a3d955-0f90-4f59-8282-e108b819b5fe","Type":"ContainerDied","Data":"27ce4599cd4bae881a8fbc5c3f5cfe8107227493cdbe5fd304500b5a948c6e5a"} Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.137835 5118 scope.go:117] "RemoveContainer" containerID="bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.200630 5118 scope.go:117] "RemoveContainer" containerID="8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.205654 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n2g9"] Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.212906 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9n2g9"] Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.252799 5118 scope.go:117] "RemoveContainer" containerID="aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.274904 5118 scope.go:117] "RemoveContainer" containerID="bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6" Feb 23 07:11:13 crc kubenswrapper[5118]: E0223 07:11:13.275426 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6\": container with ID starting with bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6 not found: ID does not exist" containerID="bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.275480 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6"} err="failed to get container status \"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6\": rpc error: code = NotFound desc = could not find container \"bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6\": container with ID starting with bb44f61ec95fe67cd97102f66cea38f802cc5322452dcaf8069a25d41bda0ce6 not found: ID does not exist" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.275519 5118 scope.go:117] "RemoveContainer" containerID="8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2" Feb 23 07:11:13 crc kubenswrapper[5118]: E0223 07:11:13.276011 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2\": container with ID starting with 8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2 not found: ID does not exist" containerID="8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.276064 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2"} err="failed to get container status \"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2\": rpc error: code = NotFound desc = could not find container \"8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2\": container with ID starting with 8250500e532799b6a6b00874b94e90fd93ed5b061dc57873df16b0ac0b8b14e2 not found: ID does not exist" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.276128 5118 scope.go:117] "RemoveContainer" containerID="aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4" Feb 23 07:11:13 crc kubenswrapper[5118]: E0223 
07:11:13.276573 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4\": container with ID starting with aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4 not found: ID does not exist" containerID="aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.276605 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4"} err="failed to get container status \"aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4\": rpc error: code = NotFound desc = could not find container \"aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4\": container with ID starting with aecb16239a35adc0c524a8e83178e43f1c6dd6578bc339ad0edf42fdc9ab77c4 not found: ID does not exist" Feb 23 07:11:13 crc kubenswrapper[5118]: I0223 07:11:13.715864 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" path="/var/lib/kubelet/pods/55a3d955-0f90-4f59-8282-e108b819b5fe/volumes" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.377019 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"] Feb 23 07:11:20 crc kubenswrapper[5118]: E0223 07:11:20.377909 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="extract-utilities" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.377924 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="extract-utilities" Feb 23 07:11:20 crc kubenswrapper[5118]: E0223 07:11:20.377947 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="rabbitmq" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.377955 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="rabbitmq" Feb 23 07:11:20 crc kubenswrapper[5118]: E0223 07:11:20.377978 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="registry-server" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.377986 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="registry-server" Feb 23 07:11:20 crc kubenswrapper[5118]: E0223 07:11:20.378000 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="setup-container" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.378009 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="setup-container" Feb 23 07:11:20 crc kubenswrapper[5118]: E0223 07:11:20.378023 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="extract-content" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.378030 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="extract-content" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.378195 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5721793b-d753-4519-b484-fa9cb958def9" containerName="rabbitmq" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.378215 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a3d955-0f90-4f59-8282-e108b819b5fe" containerName="registry-server" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.379463 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.394465 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"] Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.431020 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bcl\" (UniqueName: \"kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.431239 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.431317 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.532684 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.532761 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c2bcl\" (UniqueName: \"kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.532850 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.533313 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.533382 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.555330 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bcl\" (UniqueName: \"kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl\") pod \"certified-operators-dcwz9\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") " pod="openshift-marketplace/certified-operators-dcwz9" Feb 23 07:11:20 crc kubenswrapper[5118]: I0223 07:11:20.701228 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:21 crc kubenswrapper[5118]: I0223 07:11:21.235286 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"]
Feb 23 07:11:22 crc kubenswrapper[5118]: I0223 07:11:22.264536 5118 generic.go:334] "Generic (PLEG): container finished" podID="338273ec-7579-4cf8-810d-be812dd70ed8" containerID="cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741" exitCode=0
Feb 23 07:11:22 crc kubenswrapper[5118]: I0223 07:11:22.264762 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerDied","Data":"cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741"}
Feb 23 07:11:22 crc kubenswrapper[5118]: I0223 07:11:22.265289 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerStarted","Data":"a00f5903cd57db8ff4e9df7aa0f44eeb6aa25597594c0b741754a35652732350"}
Feb 23 07:11:23 crc kubenswrapper[5118]: I0223 07:11:23.281674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerStarted","Data":"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"}
Feb 23 07:11:24 crc kubenswrapper[5118]: I0223 07:11:24.302292 5118 generic.go:334] "Generic (PLEG): container finished" podID="338273ec-7579-4cf8-810d-be812dd70ed8" containerID="cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad" exitCode=0
Feb 23 07:11:24 crc kubenswrapper[5118]: I0223 07:11:24.302390 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerDied","Data":"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"}
Feb 23 07:11:25 crc kubenswrapper[5118]: I0223 07:11:25.317837 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerStarted","Data":"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"}
Feb 23 07:11:25 crc kubenswrapper[5118]: I0223 07:11:25.348534 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dcwz9" podStartSLOduration=2.928132535 podStartE2EDuration="5.34851262s" podCreationTimestamp="2026-02-23 07:11:20 +0000 UTC" firstStartedPulling="2026-02-23 07:11:22.268234998 +0000 UTC m=+1545.272019591" lastFinishedPulling="2026-02-23 07:11:24.688615063 +0000 UTC m=+1547.692399676" observedRunningTime="2026-02-23 07:11:25.339521719 +0000 UTC m=+1548.343306302" watchObservedRunningTime="2026-02-23 07:11:25.34851262 +0000 UTC m=+1548.352297203"
Feb 23 07:11:30 crc kubenswrapper[5118]: I0223 07:11:30.702031 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:30 crc kubenswrapper[5118]: I0223 07:11:30.702732 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:30 crc kubenswrapper[5118]: I0223 07:11:30.772820 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:31 crc kubenswrapper[5118]: I0223 07:11:31.457894 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:31 crc kubenswrapper[5118]: I0223 07:11:31.533222 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"]
Feb 23 07:11:33 crc kubenswrapper[5118]: I0223 07:11:33.410200 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dcwz9" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="registry-server" containerID="cri-o://7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8" gracePeriod=2
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.388180 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.409378 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content\") pod \"338273ec-7579-4cf8-810d-be812dd70ed8\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") "
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.409476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2bcl\" (UniqueName: \"kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl\") pod \"338273ec-7579-4cf8-810d-be812dd70ed8\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") "
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.409566 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities\") pod \"338273ec-7579-4cf8-810d-be812dd70ed8\" (UID: \"338273ec-7579-4cf8-810d-be812dd70ed8\") "
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.412007 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities" (OuterVolumeSpecName: "utilities") pod "338273ec-7579-4cf8-810d-be812dd70ed8" (UID: "338273ec-7579-4cf8-810d-be812dd70ed8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.429585 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl" (OuterVolumeSpecName: "kube-api-access-c2bcl") pod "338273ec-7579-4cf8-810d-be812dd70ed8" (UID: "338273ec-7579-4cf8-810d-be812dd70ed8"). InnerVolumeSpecName "kube-api-access-c2bcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.439987 5118 generic.go:334] "Generic (PLEG): container finished" podID="338273ec-7579-4cf8-810d-be812dd70ed8" containerID="7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8" exitCode=0
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.440079 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerDied","Data":"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"}
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.440170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcwz9" event={"ID":"338273ec-7579-4cf8-810d-be812dd70ed8","Type":"ContainerDied","Data":"a00f5903cd57db8ff4e9df7aa0f44eeb6aa25597594c0b741754a35652732350"}
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.440222 5118 scope.go:117] "RemoveContainer" containerID="7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.440396 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcwz9"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.476188 5118 scope.go:117] "RemoveContainer" containerID="cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.478106 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "338273ec-7579-4cf8-810d-be812dd70ed8" (UID: "338273ec-7579-4cf8-810d-be812dd70ed8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.503727 5118 scope.go:117] "RemoveContainer" containerID="cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.511366 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.511397 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2bcl\" (UniqueName: \"kubernetes.io/projected/338273ec-7579-4cf8-810d-be812dd70ed8-kube-api-access-c2bcl\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.511408 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338273ec-7579-4cf8-810d-be812dd70ed8-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.532651 5118 scope.go:117] "RemoveContainer" containerID="7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"
Feb 23 07:11:34 crc kubenswrapper[5118]: E0223 07:11:34.533269 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8\": container with ID starting with 7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8 not found: ID does not exist" containerID="7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.533316 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8"} err="failed to get container status \"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8\": rpc error: code = NotFound desc = could not find container \"7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8\": container with ID starting with 7a4d2e58a71ff54ac013e63f0fbc921bb88092240b626d0461849b7e5dd753c8 not found: ID does not exist"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.533344 5118 scope.go:117] "RemoveContainer" containerID="cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"
Feb 23 07:11:34 crc kubenswrapper[5118]: E0223 07:11:34.533727 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad\": container with ID starting with cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad not found: ID does not exist" containerID="cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.533776 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad"} err="failed to get container status \"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad\": rpc error: code = NotFound desc = could not find container \"cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad\": container with ID starting with cc15721c09a970e7331bb722551cf713b0596c0ded5f8f7b6fdcacac40273dad not found: ID does not exist"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.533805 5118 scope.go:117] "RemoveContainer" containerID="cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741"
Feb 23 07:11:34 crc kubenswrapper[5118]: E0223 07:11:34.534208 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741\": container with ID starting with cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741 not found: ID does not exist" containerID="cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.534244 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741"} err="failed to get container status \"cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741\": rpc error: code = NotFound desc = could not find container \"cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741\": container with ID starting with cb7653943c4a5e0195de0df2fa4385232e7d7665f7087f660045af79d76d2741 not found: ID does not exist"
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.811646 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"]
Feb 23 07:11:34 crc kubenswrapper[5118]: I0223 07:11:34.821634 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dcwz9"]
Feb 23 07:11:35 crc kubenswrapper[5118]: I0223 07:11:35.713068 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" path="/var/lib/kubelet/pods/338273ec-7579-4cf8-810d-be812dd70ed8/volumes"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.709524 5118 scope.go:117] "RemoveContainer" containerID="a2e3b6a370edeb80f110bba958bb271f7682fd85974733af6d118cfdaf41a31c"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.763679 5118 scope.go:117] "RemoveContainer" containerID="e5f4d3331c3c713d417d415e0c9c87c538d3282e22fb34df0e3af4f645992818"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.808650 5118 scope.go:117] "RemoveContainer" containerID="f1fb890b79737bfbf99ce5ecc106b736e0487142f1609990b3b71bfce17058cd"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.843799 5118 scope.go:117] "RemoveContainer" containerID="62a6b39873821838f5e6ec500bb13fb1c5bafa215a40ab539f49df60bc1bfa58"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.881172 5118 scope.go:117] "RemoveContainer" containerID="5e73fa4580f4c73c9c2474d6d8e409a299fd6e9d2a0cb10f8a794c016381abeb"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.931343 5118 scope.go:117] "RemoveContainer" containerID="765ded8ac4c466b73ff678e59e3c1bd2c7e26f27d5b181a36f4fa8da845c96bd"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.966883 5118 scope.go:117] "RemoveContainer" containerID="e04bed901323060d8bdbf121a8c0a21a8592ac08dc80af47b6dc5111c17834a4"
Feb 23 07:11:44 crc kubenswrapper[5118]: I0223 07:11:44.999693 5118 scope.go:117] "RemoveContainer" containerID="d3a70082746e2d19d58943e96283801f0727e8f267055d1291d429d8d0285236"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.028875 5118 scope.go:117] "RemoveContainer" containerID="0e7c3b3803f8e6c09d371de5e9ca37ddd7fa5760543321c91850ad4c7a5f01e4"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.060992 5118 scope.go:117] "RemoveContainer" containerID="6017ddaf1fbb67ccc35bf16bd2c5d9c4390cd9ca9d79da4052a7cd0a0dfc5b1a"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.100234 5118 scope.go:117] "RemoveContainer" containerID="efeb2dee3d60940a41bd6f825379f0cd8db4a63a4c9712ef10172e0587fe62a9"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.130955 5118 scope.go:117] "RemoveContainer" containerID="140e9d9472eed0af6d34cad05e08bfca2a8201add4f370c162aeadd6dde25013"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.162441 5118 scope.go:117] "RemoveContainer" containerID="6a23fd3d2542fd72adf146968189c89588ba91f524f068f3518a0a21f82692ed"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.190389 5118 scope.go:117] "RemoveContainer" containerID="23aa93acb427382b8b9e35e7ba3fe6a0a163d9178713ac3213c7a866a2c6d3e7"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.229459 5118 scope.go:117] "RemoveContainer" containerID="e782f3020fb567d1d112542543ef24f19219d86c018400d9252caaab0cecce88"
Feb 23 07:11:45 crc kubenswrapper[5118]: I0223 07:11:45.294083 5118 scope.go:117] "RemoveContainer" containerID="fe75c42f85412ce15214f8ef6bc41dd0ed931d945a2697c44e4ce3aee60b1692"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.633362 5118 scope.go:117] "RemoveContainer" containerID="1de6314811ebb64ce7d6e81e27c3ebe9806a6a2be25b7adcba9d4d318f00ff32"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.694029 5118 scope.go:117] "RemoveContainer" containerID="320a0c8fe4ce9d33211c93e92ea408a8c42d0c18964f7dda8ef928a9afbb7e0f"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.748984 5118 scope.go:117] "RemoveContainer" containerID="f91de33e332b8b4a6dbf6738e42a54c52bd94c43aa50a9f0ff04ca4baa4554bd"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.821592 5118 scope.go:117] "RemoveContainer" containerID="7cb8242dbfa61e0770e846acb68fe689bced9a76568c54dd9f46da08dc2ffd0a"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.876364 5118 scope.go:117] "RemoveContainer" containerID="9c2df29044a4a6505db392529c5d66e0c129921b569f448d9246b1dfaaeab1f6"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.903217 5118 scope.go:117] "RemoveContainer" containerID="4321cb33d1b2bec2a061b811cc47a8f13fcdd296c2f386145952801ad6df548b"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.943630 5118 scope.go:117] "RemoveContainer" containerID="402a757ca854a9d41f888fbe557b2790ea14bfab0b9b1beebf4734563316e710"
Feb 23 07:12:45 crc kubenswrapper[5118]: I0223 07:12:45.975993 5118 scope.go:117] "RemoveContainer" containerID="c97640bf824262195b04827fc1d9a2c28b53219df60d5c4d4ae2cebf410083aa"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.003587 5118 scope.go:117] "RemoveContainer" containerID="45ffa8bb3a7b2fda9250327348a8dbae14fb92a0d0dba32bea29be4860f5a723"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.028848 5118 scope.go:117] "RemoveContainer" containerID="53c815e6e0d97274de16eccba1c763fdb5994fa4f5076151e86ee65fd19101e1"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.053771 5118 scope.go:117] "RemoveContainer" containerID="db1d242dbe5d63f480d76e72117a96ec58b79d79ceb0c11acd8fe34bb58ce66f"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.099127 5118 scope.go:117] "RemoveContainer" containerID="fd21cfd0580525c1b7844427525a7e2066f46567f71940f55223ebed1457c9b1"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.125662 5118 scope.go:117] "RemoveContainer" containerID="21901553cbeacd59d5ec7cc9bf72db01ee50046e05f98b576ecedf03ec161b8b"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.157254 5118 scope.go:117] "RemoveContainer" containerID="085fb994609b4b7cd53dbff3fc6fd8454ce84798c4da501d558fe01c7bd93cad"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.214726 5118 scope.go:117] "RemoveContainer" containerID="e3d135e626ac942021e458bc4a7ab385023f6d9287f81b2ebc06d10db2903a65"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.272213 5118 scope.go:117] "RemoveContainer" containerID="bf753c5ff204f7671f6f916f3471b964c3b8e931842477bec2466966122fb249"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.306539 5118 scope.go:117] "RemoveContainer" containerID="ef137dc1dae09884fa1f818bc1a3acbbc6d8e89520fb8a24cc76bcacf3b1d973"
Feb 23 07:12:46 crc kubenswrapper[5118]: I0223 07:12:46.353638 5118 scope.go:117] "RemoveContainer" containerID="08f5ae809a52605089fc9cdb111ba3bc2e8bb6a1dc22b8e8172f0f7b7b1321e9"
Feb 23 07:13:32 crc kubenswrapper[5118]: I0223 07:13:32.975975 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:13:32 crc kubenswrapper[5118]: I0223 07:13:32.976977 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.750827 5118 scope.go:117] "RemoveContainer" containerID="838eea4ca37b326a64d5d678a35327acd726a105a4ac2e5f2979892d448df136"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.798022 5118 scope.go:117] "RemoveContainer" containerID="ee31de91e40b156a453f3da6cf9d535c4074b1b72364559bf6cedba450085aa1"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.825691 5118 scope.go:117] "RemoveContainer" containerID="6562661aead1d6afb939f1cc5250e5487e9146e38f4c8f43650a040bdd480d69"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.857898 5118 scope.go:117] "RemoveContainer" containerID="d9b130e05974e910c11275a4779f706f2b8c61a00e6da82692831f10958cd486"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.883523 5118 scope.go:117] "RemoveContainer" containerID="6d1930f97fefea30957a1d47896f52db8ffd741601261f9d1da5be58352aff9b"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.907191 5118 scope.go:117] "RemoveContainer" containerID="27b321bfffbb4bd657b82546013c7861bf1c2f5af05ee3956e5a515a75954c59"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.936257 5118 scope.go:117] "RemoveContainer" containerID="56b4d180783d05f72c701c589b061dafc5eb3886c60c04d8b8ec016724107c8f"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.962906 5118 scope.go:117] "RemoveContainer" containerID="8f957cc8e11a9fbbf6380e9c0269fb9c7337a8d09534ea1c9361fe65ece901ca"
Feb 23 07:13:46 crc kubenswrapper[5118]: I0223 07:13:46.990024 5118 scope.go:117] "RemoveContainer" containerID="744308f3e6aa45dc8d65925950643c4deaca79d38d7fbf553daec9ce3a79d864"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.028114 5118 scope.go:117] "RemoveContainer" containerID="4478c2eb2702eff226081dd62c590b24a317a608e7697bf1cd6f86ae6c3d71e1"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.056648 5118 scope.go:117] "RemoveContainer" containerID="66b033f5cdba1801f896cac33e18ec0b1d1f6261e8ea02c514f953da7c1c5a84"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.095202 5118 scope.go:117] "RemoveContainer" containerID="1b8d408fe3b77e3bb503c8839eb4d16302c83548ed27ca45d62d3fefc3f0f0fb"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.131831 5118 scope.go:117] "RemoveContainer" containerID="b790bed49ac437e2b4ca3f9fbeb049900c080653098145f84fa077dc058b9955"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.204122 5118 scope.go:117] "RemoveContainer" containerID="734762bdf80dd183a26cac81a8b5b957a69d69feded0dd37247d0ea7e3a3cd24"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.239170 5118 scope.go:117] "RemoveContainer" containerID="c1cb45006da98332eb6cd8b2fbe6eb461c5eecf7073435dc50db5dc59fb5544b"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.272736 5118 scope.go:117] "RemoveContainer" containerID="a331d02a6eec032c5afe8e514192d44b34c7dc18aaea080dddcb420228f17459"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.303668 5118 scope.go:117] "RemoveContainer" containerID="befb16d8a3f73c9a8d367e972c7e2cebda984c914cdc82b0856f084d6aa276f7"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.339545 5118 scope.go:117] "RemoveContainer" containerID="bf1c589b730495897b68ee4004af7ede183db68decb4860331926ec63e038b03"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.375061 5118 scope.go:117] "RemoveContainer" containerID="50ca6bcdb97f2c6020994172d7c32e3ae68a54330a49a74eecef1ee656c8e4db"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.407850 5118 scope.go:117] "RemoveContainer" containerID="b78a6f1ffe539d4eb30bf74ad5a43f3802cf0350d1aba256152e434c6b28d94e"
Feb 23 07:13:47 crc kubenswrapper[5118]: I0223 07:13:47.454058 5118 scope.go:117] "RemoveContainer" containerID="d4786168035f880a50d9daf3c146a82f881d078e61fee117923247704b6bc372"
Feb 23 07:14:02 crc kubenswrapper[5118]: I0223 07:14:02.975134 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:14:02 crc kubenswrapper[5118]: I0223 07:14:02.976083 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:14:32 crc kubenswrapper[5118]: I0223 07:14:32.975618 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:14:32 crc kubenswrapper[5118]: I0223 07:14:32.976578 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:14:32 crc kubenswrapper[5118]: I0223 07:14:32.976643 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 07:14:32 crc kubenswrapper[5118]: I0223 07:14:32.977507 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:14:32 crc kubenswrapper[5118]: I0223 07:14:32.977566 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" gracePeriod=600
Feb 23 07:14:33 crc kubenswrapper[5118]: E0223 07:14:33.106341 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:14:33 crc kubenswrapper[5118]: I0223 07:14:33.614751 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" exitCode=0
Feb 23 07:14:33 crc kubenswrapper[5118]: I0223 07:14:33.614821 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"}
Feb 23 07:14:33 crc kubenswrapper[5118]: I0223 07:14:33.615256 5118 scope.go:117] "RemoveContainer" containerID="4ac91484d2ab6a449ecd673a74a9b5daa98cf0d8d88bcca8f30a4d381279f2ab"
Feb 23 07:14:33 crc kubenswrapper[5118]: I0223 07:14:33.615838 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:14:33 crc kubenswrapper[5118]: E0223 07:14:33.616159 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:14:44 crc kubenswrapper[5118]: I0223 07:14:44.697342 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:14:44 crc kubenswrapper[5118]: E0223 07:14:44.698536 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.847848 5118 scope.go:117] "RemoveContainer" containerID="b3addf3691e84cbed33a8caa1ed42d09867e2ded2b5b13a3ed5d86681396513a"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.877966 5118 scope.go:117] "RemoveContainer" containerID="0adb6c329a2cd02154518bf66f2b846b623f8eee18887b789bac01be55ba8b0b"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.919171 5118 scope.go:117] "RemoveContainer" containerID="85d21d0dac4a95b6e222c0bbe6b332c072d7e302a0738ab596e737641693a839"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.946285 5118 scope.go:117] "RemoveContainer" containerID="5b1673251e24d33e14cb885d1a30100b57c0b80b0883e50be4e26c0544e5eeaa"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.967070 5118 scope.go:117] "RemoveContainer" containerID="7aa9aba83a6a4faa226f1ac01b3aa1d78be1db352ff3cdcb6601be775a19cd5c"
Feb 23 07:14:47 crc kubenswrapper[5118]: I0223 07:14:47.995328 5118 scope.go:117] "RemoveContainer" containerID="dfbcad5efe1813a218d7063b164a48376977a850a9936dc6b7bc7563d47686e9"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.048051 5118 scope.go:117] "RemoveContainer" containerID="f79ac5b11fb0deba4274f00ce33e6c2a41832523dee197459bcb484c55984024"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.070194 5118 scope.go:117] "RemoveContainer" containerID="896b4f95bb97fad9176afa275040bcd7ee90ab9dc9acb4253ebbdb98478bbd70"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.085718 5118 scope.go:117] "RemoveContainer" containerID="6820419b366ae8b47d31039250fb3d906f1f99ec9b556bf263a6f996823a2472"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.100717 5118 scope.go:117] "RemoveContainer" containerID="63d318b94849d84942e140978eb3d696a0f38b314c7ae1a84ee03f020ac59443"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.121785 5118 scope.go:117] "RemoveContainer" containerID="f9fbd71a03f2a93ba309cdce906910ae6003771c797503a1153318d351d6f24a"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.152516 5118 scope.go:117] "RemoveContainer" containerID="a065dfe336ce5b3881868f56481691176a9b31d3f9dbcb65b5a810d6d7860b51"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.171905 5118 scope.go:117] "RemoveContainer" containerID="6b8c6eb9b9ce37fbe932ac5074db82c6dfa9e1f23cb6e0b090172cccdaf73dcf"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.189756 5118 scope.go:117] "RemoveContainer" containerID="d4558ce9d6310e8db6eb2cdf9488a47360b540290b0f1dfc0509bc3c7c548193"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.212266 5118 scope.go:117] "RemoveContainer" containerID="b1c8d946aea11cb7af4ee8d18d5fddb1fc1d4ea9530beb309c55bad6ab1ac93a"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.250370 5118 scope.go:117] "RemoveContainer" containerID="4000f09676001090ea18cdb7df7926a1a769a2407d4b9008c3f8702217c3491e"
Feb 23 07:14:48 crc kubenswrapper[5118]: I0223 07:14:48.274379 5118 scope.go:117] "RemoveContainer" containerID="85649c8553e19892cd3a25464dcbc4e9c7dbd2ca6f717c585fde0037df3d2ef1"
Feb 23 07:14:56 crc kubenswrapper[5118]: I0223 07:14:56.697852 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:14:56 crc kubenswrapper[5118]: E0223 07:14:56.700618 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.185093 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"]
Feb 23 07:15:00 crc kubenswrapper[5118]: E0223 07:15:00.186067 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="registry-server"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.186093 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="registry-server"
Feb 23 07:15:00 crc kubenswrapper[5118]: E0223 07:15:00.186201 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="extract-utilities"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.186216 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="extract-utilities"
Feb 23 07:15:00 crc kubenswrapper[5118]: E0223 07:15:00.186247 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="extract-content"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.186263 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="extract-content"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.186559 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="338273ec-7579-4cf8-810d-be812dd70ed8" containerName="registry-server"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.187389 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.193592 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.193874 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.200261 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"]
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.367363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnmt\" (UniqueName: \"kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.367434 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.367477 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.468982 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnmt\" (UniqueName: \"kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.469063 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.469129 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.470312 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.477338 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.498137 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnmt\" (UniqueName: \"kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt\") pod \"collect-profiles-29530515-frp56\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.525655 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
Feb 23 07:15:00 crc kubenswrapper[5118]: I0223 07:15:00.802055 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"]
Feb 23 07:15:01 crc kubenswrapper[5118]: I0223 07:15:01.008938 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56" event={"ID":"239bb60c-59c0-4f58-af9f-aa12fc4226d5","Type":"ContainerStarted","Data":"8fb38ff183331b2d43d7578059a9c6e8443ca1266d5827a5ed19d3c5371aa1bf"}
Feb 23 07:15:02 crc kubenswrapper[5118]: I0223 07:15:02.050420 5118 generic.go:334] "Generic (PLEG): container finished" podID="239bb60c-59c0-4f58-af9f-aa12fc4226d5" containerID="5a1a2212618fb7f02f27fc3c3d2b8b978cd62e1cbb98ea3df5895ffd931b29a9" exitCode=0
Feb 23 07:15:02 crc kubenswrapper[5118]: I0223 07:15:02.050467 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"
event={"ID":"239bb60c-59c0-4f58-af9f-aa12fc4226d5","Type":"ContainerDied","Data":"5a1a2212618fb7f02f27fc3c3d2b8b978cd62e1cbb98ea3df5895ffd931b29a9"} Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.495319 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.532205 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xnmt\" (UniqueName: \"kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt\") pod \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.532344 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume\") pod \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.532525 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume\") pod \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\" (UID: \"239bb60c-59c0-4f58-af9f-aa12fc4226d5\") " Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.533588 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "239bb60c-59c0-4f58-af9f-aa12fc4226d5" (UID: "239bb60c-59c0-4f58-af9f-aa12fc4226d5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.539375 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt" (OuterVolumeSpecName: "kube-api-access-6xnmt") pod "239bb60c-59c0-4f58-af9f-aa12fc4226d5" (UID: "239bb60c-59c0-4f58-af9f-aa12fc4226d5"). InnerVolumeSpecName "kube-api-access-6xnmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.539490 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "239bb60c-59c0-4f58-af9f-aa12fc4226d5" (UID: "239bb60c-59c0-4f58-af9f-aa12fc4226d5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.633957 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xnmt\" (UniqueName: \"kubernetes.io/projected/239bb60c-59c0-4f58-af9f-aa12fc4226d5-kube-api-access-6xnmt\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.633998 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/239bb60c-59c0-4f58-af9f-aa12fc4226d5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:03 crc kubenswrapper[5118]: I0223 07:15:03.634010 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/239bb60c-59c0-4f58-af9f-aa12fc4226d5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:15:04 crc kubenswrapper[5118]: I0223 07:15:04.072743 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56" 
event={"ID":"239bb60c-59c0-4f58-af9f-aa12fc4226d5","Type":"ContainerDied","Data":"8fb38ff183331b2d43d7578059a9c6e8443ca1266d5827a5ed19d3c5371aa1bf"} Feb 23 07:15:04 crc kubenswrapper[5118]: I0223 07:15:04.072827 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fb38ff183331b2d43d7578059a9c6e8443ca1266d5827a5ed19d3c5371aa1bf" Feb 23 07:15:04 crc kubenswrapper[5118]: I0223 07:15:04.072820 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56" Feb 23 07:15:08 crc kubenswrapper[5118]: I0223 07:15:08.697831 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:15:08 crc kubenswrapper[5118]: E0223 07:15:08.698716 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:15:20 crc kubenswrapper[5118]: I0223 07:15:20.698233 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:15:20 crc kubenswrapper[5118]: E0223 07:15:20.699324 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:15:31 crc kubenswrapper[5118]: I0223 07:15:31.701530 5118 
scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:15:31 crc kubenswrapper[5118]: E0223 07:15:31.704952 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:15:43 crc kubenswrapper[5118]: I0223 07:15:43.697953 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:15:43 crc kubenswrapper[5118]: E0223 07:15:43.699346 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.445427 5118 scope.go:117] "RemoveContainer" containerID="5285ecb4115449aa0975018a2c7a4ad15449770fc6a16a5b4cc0704a6eea208b" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.479168 5118 scope.go:117] "RemoveContainer" containerID="0751ec02f589e9689680bdbc3c817812f2676d3d07252002303baef1e65fe57b" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.528398 5118 scope.go:117] "RemoveContainer" containerID="63d5f3f40e0bd27af62b5fc9b31aa60381f047575c6ba94cbf1284e1e6bbf343" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.564239 5118 scope.go:117] "RemoveContainer" containerID="e6177a5edc264e84a87c302b966b45b66cce1a423d61e2b512f7e870b53fe9e4" Feb 23 
07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.628235 5118 scope.go:117] "RemoveContainer" containerID="399a4e9366242ca3df6176dc90c586ff3c7335dbed164ce0eb710274b70efc9d" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.663976 5118 scope.go:117] "RemoveContainer" containerID="af045486508fe8cab1c8b590668cb9fcd431b4bc730c81a19eb7129715e8530d" Feb 23 07:15:48 crc kubenswrapper[5118]: I0223 07:15:48.698850 5118 scope.go:117] "RemoveContainer" containerID="d26689d86956826776f6a68df424f29414b578bdb30f6c4e05418d6010a876b3" Feb 23 07:15:57 crc kubenswrapper[5118]: I0223 07:15:57.704468 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:15:57 crc kubenswrapper[5118]: E0223 07:15:57.705625 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:16:09 crc kubenswrapper[5118]: I0223 07:16:09.698286 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:16:09 crc kubenswrapper[5118]: E0223 07:16:09.699662 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:16:24 crc kubenswrapper[5118]: I0223 07:16:24.698992 5118 scope.go:117] "RemoveContainer" 
containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:16:24 crc kubenswrapper[5118]: E0223 07:16:24.700466 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:16:36 crc kubenswrapper[5118]: I0223 07:16:36.697321 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:16:36 crc kubenswrapper[5118]: E0223 07:16:36.698553 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:16:50 crc kubenswrapper[5118]: I0223 07:16:50.697211 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:16:50 crc kubenswrapper[5118]: E0223 07:16:50.698201 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:05 crc kubenswrapper[5118]: I0223 07:17:05.697447 5118 scope.go:117] 
"RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:17:05 crc kubenswrapper[5118]: E0223 07:17:05.698487 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:17 crc kubenswrapper[5118]: I0223 07:17:17.703601 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:17:17 crc kubenswrapper[5118]: E0223 07:17:17.704726 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:28 crc kubenswrapper[5118]: I0223 07:17:28.697289 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:17:28 crc kubenswrapper[5118]: E0223 07:17:28.698344 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.340765 
5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"] Feb 23 07:17:38 crc kubenswrapper[5118]: E0223 07:17:38.342264 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="239bb60c-59c0-4f58-af9f-aa12fc4226d5" containerName="collect-profiles" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.342289 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="239bb60c-59c0-4f58-af9f-aa12fc4226d5" containerName="collect-profiles" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.342638 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="239bb60c-59c0-4f58-af9f-aa12fc4226d5" containerName="collect-profiles" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.347056 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.370478 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"] Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.455814 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6jz\" (UniqueName: \"kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.456160 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.456225 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.558366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6jz\" (UniqueName: \"kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.558639 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.558714 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.559351 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.559406 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.596190 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6jz\" (UniqueName: \"kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz\") pod \"redhat-operators-qr8l7\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") " pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:38 crc kubenswrapper[5118]: I0223 07:17:38.680511 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:39 crc kubenswrapper[5118]: I0223 07:17:39.189775 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"] Feb 23 07:17:39 crc kubenswrapper[5118]: I0223 07:17:39.666192 5118 generic.go:334] "Generic (PLEG): container finished" podID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerID="861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78" exitCode=0 Feb 23 07:17:39 crc kubenswrapper[5118]: I0223 07:17:39.666253 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerDied","Data":"861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78"} Feb 23 07:17:39 crc kubenswrapper[5118]: I0223 07:17:39.666614 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerStarted","Data":"9f4d345ca92d3503407bebb38b9650d299fff58a9104b245e9cb59ef4b516fe4"} Feb 23 07:17:39 crc kubenswrapper[5118]: I0223 07:17:39.669267 5118 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:17:40 crc kubenswrapper[5118]: I0223 07:17:40.680272 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerStarted","Data":"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"} Feb 23 07:17:41 crc kubenswrapper[5118]: I0223 07:17:41.694471 5118 generic.go:334] "Generic (PLEG): container finished" podID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerID="1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3" exitCode=0 Feb 23 07:17:41 crc kubenswrapper[5118]: I0223 07:17:41.694623 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerDied","Data":"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"} Feb 23 07:17:41 crc kubenswrapper[5118]: I0223 07:17:41.698453 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:17:41 crc kubenswrapper[5118]: E0223 07:17:41.698869 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:42 crc kubenswrapper[5118]: I0223 07:17:42.711644 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerStarted","Data":"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"} Feb 23 07:17:42 crc kubenswrapper[5118]: I0223 
07:17:42.742118 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qr8l7" podStartSLOduration=2.2769613140000002 podStartE2EDuration="4.742083673s" podCreationTimestamp="2026-02-23 07:17:38 +0000 UTC" firstStartedPulling="2026-02-23 07:17:39.669051347 +0000 UTC m=+1922.672835920" lastFinishedPulling="2026-02-23 07:17:42.134173666 +0000 UTC m=+1925.137958279" observedRunningTime="2026-02-23 07:17:42.738838444 +0000 UTC m=+1925.742623057" watchObservedRunningTime="2026-02-23 07:17:42.742083673 +0000 UTC m=+1925.745868246" Feb 23 07:17:48 crc kubenswrapper[5118]: I0223 07:17:48.680860 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:48 crc kubenswrapper[5118]: I0223 07:17:48.683457 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:49 crc kubenswrapper[5118]: I0223 07:17:49.765710 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qr8l7" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="registry-server" probeResult="failure" output=< Feb 23 07:17:49 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 07:17:49 crc kubenswrapper[5118]: > Feb 23 07:17:53 crc kubenswrapper[5118]: I0223 07:17:53.697627 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:17:53 crc kubenswrapper[5118]: E0223 07:17:53.698506 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:17:58 crc kubenswrapper[5118]: I0223 07:17:58.764520 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:58 crc kubenswrapper[5118]: I0223 07:17:58.853202 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qr8l7" Feb 23 07:17:59 crc kubenswrapper[5118]: I0223 07:17:59.013068 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"] Feb 23 07:17:59 crc kubenswrapper[5118]: I0223 07:17:59.863484 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qr8l7" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="registry-server" containerID="cri-o://b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15" gracePeriod=2 Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.344381 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qr8l7"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.399226 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities\") pod \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") "
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.399327 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content\") pod \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") "
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.399388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6jz\" (UniqueName: \"kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz\") pod \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\" (UID: \"5eac0298-c6e9-470f-8402-c5d59e5d7f80\") "
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.400792 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities" (OuterVolumeSpecName: "utilities") pod "5eac0298-c6e9-470f-8402-c5d59e5d7f80" (UID: "5eac0298-c6e9-470f-8402-c5d59e5d7f80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.408972 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz" (OuterVolumeSpecName: "kube-api-access-ng6jz") pod "5eac0298-c6e9-470f-8402-c5d59e5d7f80" (UID: "5eac0298-c6e9-470f-8402-c5d59e5d7f80"). InnerVolumeSpecName "kube-api-access-ng6jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.501394 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.501432 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng6jz\" (UniqueName: \"kubernetes.io/projected/5eac0298-c6e9-470f-8402-c5d59e5d7f80-kube-api-access-ng6jz\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.578180 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eac0298-c6e9-470f-8402-c5d59e5d7f80" (UID: "5eac0298-c6e9-470f-8402-c5d59e5d7f80"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.603772 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eac0298-c6e9-470f-8402-c5d59e5d7f80-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.877881 5118 generic.go:334] "Generic (PLEG): container finished" podID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerID="b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15" exitCode=0
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.878009 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qr8l7"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.878003 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerDied","Data":"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"}
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.880213 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qr8l7" event={"ID":"5eac0298-c6e9-470f-8402-c5d59e5d7f80","Type":"ContainerDied","Data":"9f4d345ca92d3503407bebb38b9650d299fff58a9104b245e9cb59ef4b516fe4"}
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.880260 5118 scope.go:117] "RemoveContainer" containerID="b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.907770 5118 scope.go:117] "RemoveContainer" containerID="1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.931986 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"]
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.936848 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qr8l7"]
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.961460 5118 scope.go:117] "RemoveContainer" containerID="861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.984495 5118 scope.go:117] "RemoveContainer" containerID="b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"
Feb 23 07:18:00 crc kubenswrapper[5118]: E0223 07:18:00.985032 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15\": container with ID starting with b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15 not found: ID does not exist" containerID="b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.985066 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15"} err="failed to get container status \"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15\": rpc error: code = NotFound desc = could not find container \"b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15\": container with ID starting with b3cb1fa481b9b26b53b5120cca2e02054cd4b091c9893f23d74e7eadef53fb15 not found: ID does not exist"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.985090 5118 scope.go:117] "RemoveContainer" containerID="1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"
Feb 23 07:18:00 crc kubenswrapper[5118]: E0223 07:18:00.985369 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3\": container with ID starting with 1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3 not found: ID does not exist" containerID="1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.985391 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3"} err="failed to get container status \"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3\": rpc error: code = NotFound desc = could not find container \"1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3\": container with ID starting with 1504b1fa437ea99a15492266233777d126bd474d97a90337a1ae4653c203e1a3 not found: ID does not exist"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.985407 5118 scope.go:117] "RemoveContainer" containerID="861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78"
Feb 23 07:18:00 crc kubenswrapper[5118]: E0223 07:18:00.985783 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78\": container with ID starting with 861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78 not found: ID does not exist" containerID="861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78"
Feb 23 07:18:00 crc kubenswrapper[5118]: I0223 07:18:00.985801 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78"} err="failed to get container status \"861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78\": rpc error: code = NotFound desc = could not find container \"861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78\": container with ID starting with 861533b781feba4099b6668c4bf2501dadbbd590fd36b6dc67b4b4d16c260b78 not found: ID does not exist"
Feb 23 07:18:01 crc kubenswrapper[5118]: I0223 07:18:01.715218 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" path="/var/lib/kubelet/pods/5eac0298-c6e9-470f-8402-c5d59e5d7f80/volumes"
Feb 23 07:18:06 crc kubenswrapper[5118]: I0223 07:18:06.698089 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:18:06 crc kubenswrapper[5118]: E0223 07:18:06.699129 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:18:20 crc kubenswrapper[5118]: I0223 07:18:20.697955 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:18:20 crc kubenswrapper[5118]: E0223 07:18:20.699486 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:18:32 crc kubenswrapper[5118]: I0223 07:18:32.697204 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:18:32 crc kubenswrapper[5118]: E0223 07:18:32.698689 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:18:45 crc kubenswrapper[5118]: I0223 07:18:45.699830 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:18:45 crc kubenswrapper[5118]: E0223 07:18:45.701523 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:19:00 crc kubenswrapper[5118]: I0223 07:19:00.698653 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:19:00 crc kubenswrapper[5118]: E0223 07:19:00.699709 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:19:14 crc kubenswrapper[5118]: I0223 07:19:14.698573 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:19:14 crc kubenswrapper[5118]: E0223 07:19:14.699855 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:19:28 crc kubenswrapper[5118]: I0223 07:19:28.697382 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:19:28 crc kubenswrapper[5118]: E0223 07:19:28.700169 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:19:39 crc kubenswrapper[5118]: I0223 07:19:39.697582 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b"
Feb 23 07:19:39 crc kubenswrapper[5118]: I0223 07:19:39.909817 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb"}
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.214339 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:20:54 crc kubenswrapper[5118]: E0223 07:20:54.215414 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="registry-server"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.215439 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="registry-server"
Feb 23 07:20:54 crc kubenswrapper[5118]: E0223 07:20:54.215477 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="extract-utilities"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.215489 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="extract-utilities"
Feb 23 07:20:54 crc kubenswrapper[5118]: E0223 07:20:54.215509 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="extract-content"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.215526 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="extract-content"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.215796 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eac0298-c6e9-470f-8402-c5d59e5d7f80" containerName="registry-server"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.217871 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.229427 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.363907 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.363999 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.364335 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnrh\" (UniqueName: \"kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.466311 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnrh\" (UniqueName: \"kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.466532 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.466596 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.467309 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.467458 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.502062 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnrh\" (UniqueName: \"kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh\") pod \"redhat-marketplace-r84x5\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") " pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:54 crc kubenswrapper[5118]: I0223 07:20:54.536577 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:20:55 crc kubenswrapper[5118]: W0223 07:20:55.023965 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4dd70d3_9f6f_4ef0_859a_f4cd4b2d0fc9.slice/crio-18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28 WatchSource:0}: Error finding container 18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28: Status 404 returned error can't find the container with id 18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28
Feb 23 07:20:55 crc kubenswrapper[5118]: I0223 07:20:55.027561 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:20:55 crc kubenswrapper[5118]: I0223 07:20:55.689188 5118 generic.go:334] "Generic (PLEG): container finished" podID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerID="f95af26e1b525d6febf2152271811142c2fc057b1eec5e581870a9d870a156e1" exitCode=0
Feb 23 07:20:55 crc kubenswrapper[5118]: I0223 07:20:55.689270 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerDied","Data":"f95af26e1b525d6febf2152271811142c2fc057b1eec5e581870a9d870a156e1"}
Feb 23 07:20:55 crc kubenswrapper[5118]: I0223 07:20:55.689316 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerStarted","Data":"18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28"}
Feb 23 07:20:56 crc kubenswrapper[5118]: I0223 07:20:56.703107 5118 generic.go:334] "Generic (PLEG): container finished" podID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerID="d2360ddb8f776f22ecc6218d547efcbfd4e204eb4c34644a1df9e0980d095b23" exitCode=0
Feb 23 07:20:56 crc kubenswrapper[5118]: I0223 07:20:56.703141 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerDied","Data":"d2360ddb8f776f22ecc6218d547efcbfd4e204eb4c34644a1df9e0980d095b23"}
Feb 23 07:20:57 crc kubenswrapper[5118]: I0223 07:20:57.719408 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerStarted","Data":"721385d53d80be9c90df44fe213e7bface8f17e81a107d31ee466c4a46661e9e"}
Feb 23 07:20:57 crc kubenswrapper[5118]: I0223 07:20:57.754051 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r84x5" podStartSLOduration=2.350418817 podStartE2EDuration="3.754023685s" podCreationTimestamp="2026-02-23 07:20:54 +0000 UTC" firstStartedPulling="2026-02-23 07:20:55.690948577 +0000 UTC m=+2118.694733160" lastFinishedPulling="2026-02-23 07:20:57.094553415 +0000 UTC m=+2120.098338028" observedRunningTime="2026-02-23 07:20:57.744721138 +0000 UTC m=+2120.748505791" watchObservedRunningTime="2026-02-23 07:20:57.754023685 +0000 UTC m=+2120.757808278"
Feb 23 07:21:04 crc kubenswrapper[5118]: I0223 07:21:04.537569 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:04 crc kubenswrapper[5118]: I0223 07:21:04.538281 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:04 crc kubenswrapper[5118]: I0223 07:21:04.613627 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:04 crc kubenswrapper[5118]: I0223 07:21:04.842988 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:04 crc kubenswrapper[5118]: I0223 07:21:04.891378 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:21:06 crc kubenswrapper[5118]: I0223 07:21:06.798606 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r84x5" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="registry-server" containerID="cri-o://721385d53d80be9c90df44fe213e7bface8f17e81a107d31ee466c4a46661e9e" gracePeriod=2
Feb 23 07:21:07 crc kubenswrapper[5118]: I0223 07:21:07.811369 5118 generic.go:334] "Generic (PLEG): container finished" podID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerID="721385d53d80be9c90df44fe213e7bface8f17e81a107d31ee466c4a46661e9e" exitCode=0
Feb 23 07:21:07 crc kubenswrapper[5118]: I0223 07:21:07.811430 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerDied","Data":"721385d53d80be9c90df44fe213e7bface8f17e81a107d31ee466c4a46661e9e"}
Feb 23 07:21:07 crc kubenswrapper[5118]: I0223 07:21:07.811708 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r84x5" event={"ID":"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9","Type":"ContainerDied","Data":"18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28"}
Feb 23 07:21:07 crc kubenswrapper[5118]: I0223 07:21:07.811734 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18262b7997145e1a2532f93bc09febd6a29ec971e4a7b89ba1721702c0643b28"
Feb 23 07:21:07 crc kubenswrapper[5118]: I0223 07:21:07.811981 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.005928 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content\") pod \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") "
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.006126 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcnrh\" (UniqueName: \"kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh\") pod \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") "
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.006200 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities\") pod \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\" (UID: \"a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9\") "
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.007500 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities" (OuterVolumeSpecName: "utilities") pod "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" (UID: "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.016144 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh" (OuterVolumeSpecName: "kube-api-access-zcnrh") pod "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" (UID: "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9"). InnerVolumeSpecName "kube-api-access-zcnrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.050894 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" (UID: "a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.108719 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.108770 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.108794 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcnrh\" (UniqueName: \"kubernetes.io/projected/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9-kube-api-access-zcnrh\") on node \"crc\" DevicePath \"\""
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.820858 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r84x5"
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.885376 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:21:08 crc kubenswrapper[5118]: I0223 07:21:08.900230 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r84x5"]
Feb 23 07:21:09 crc kubenswrapper[5118]: I0223 07:21:09.717347 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" path="/var/lib/kubelet/pods/a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9/volumes"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.841647 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sd8gs"]
Feb 23 07:21:12 crc kubenswrapper[5118]: E0223 07:21:12.848903 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="registry-server"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.848952 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="registry-server"
Feb 23 07:21:12 crc kubenswrapper[5118]: E0223 07:21:12.848989 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="extract-utilities"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.849003 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="extract-utilities"
Feb 23 07:21:12 crc kubenswrapper[5118]: E0223 07:21:12.849048 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="extract-content"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.849061 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="extract-content"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.849396 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dd70d3-9f6f-4ef0-859a-f4cd4b2d0fc9" containerName="registry-server"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.851189 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.860217 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sd8gs"]
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.902548 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfknh\" (UniqueName: \"kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.902687 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:12 crc kubenswrapper[5118]: I0223 07:21:12.902807 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.003731 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.003829 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.003875 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfknh\" (UniqueName: \"kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.004536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.004718 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.040639 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfknh\" (UniqueName: \"kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh\") pod \"community-operators-sd8gs\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.203937 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sd8gs"
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.729764 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sd8gs"]
Feb 23 07:21:13 crc kubenswrapper[5118]: I0223 07:21:13.876954 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerStarted","Data":"86b23f5a73fc0730deb648cfe8dc9d09f85751cf03979887f4460a6428ed36d0"}
Feb 23 07:21:14 crc kubenswrapper[5118]: I0223 07:21:14.889386 5118 generic.go:334] "Generic (PLEG): container finished" podID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerID="4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018" exitCode=0
Feb 23 07:21:14 crc kubenswrapper[5118]: I0223 07:21:14.889455 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerDied","Data":"4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018"}
Feb 23 07:21:15 crc kubenswrapper[5118]: I0223 07:21:15.904925 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerStarted","Data":"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a"}
Feb 23 07:21:16 crc kubenswrapper[5118]: I0223 07:21:16.919474 5118 generic.go:334] "Generic (PLEG): container finished" podID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerID="4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a" exitCode=0
Feb 23 07:21:16 crc kubenswrapper[5118]: I0223 07:21:16.919552 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerDied","Data":"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a"}
Feb 23 07:21:17 crc kubenswrapper[5118]: I0223 07:21:17.933314 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerStarted","Data":"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7"}
Feb 23 07:21:17 crc kubenswrapper[5118]: I0223 07:21:17.970301 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sd8gs" podStartSLOduration=3.543749374 podStartE2EDuration="5.970279293s" podCreationTimestamp="2026-02-23 07:21:12 +0000 UTC" firstStartedPulling="2026-02-23 07:21:14.892440951 +0000 UTC m=+2137.896225534" lastFinishedPulling="2026-02-23 07:21:17.31897084 +0000 UTC m=+2140.322755453" observedRunningTime="2026-02-23 07:21:17.959251963 +0000 UTC m=+2140.963036596" watchObservedRunningTime="2026-02-23 07:21:17.970279293 +0000 UTC m=+2140.974063876"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.024723 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"]
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.028008 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wzcs"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.048219 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"]
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.048475 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.048553 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8h6\" (UniqueName: \"kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.048637 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.149892 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs"
Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.150015 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4r8h6\" (UniqueName: \"kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.150124 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.150766 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.151045 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.179362 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8h6\" (UniqueName: \"kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6\") pod \"certified-operators-6wzcs\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.354393 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.660494 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"] Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.970150 5118 generic.go:334] "Generic (PLEG): container finished" podID="69456a54-9a29-4036-af66-f6c601f7c43d" containerID="a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7" exitCode=0 Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.970422 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerDied","Data":"a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7"} Feb 23 07:21:21 crc kubenswrapper[5118]: I0223 07:21:21.970446 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerStarted","Data":"981e12ecbb9b64676ae14542cf4b8c8a2983b988e2b2d765325defe12b8854b8"} Feb 23 07:21:23 crc kubenswrapper[5118]: I0223 07:21:23.204818 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:23 crc kubenswrapper[5118]: I0223 07:21:23.205314 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:23 crc kubenswrapper[5118]: I0223 07:21:23.261014 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:24 crc kubenswrapper[5118]: I0223 07:21:24.053972 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.006511 5118 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sd8gs"] Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.009231 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sd8gs" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="registry-server" containerID="cri-o://a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7" gracePeriod=2 Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.679323 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.750165 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities\") pod \"18c25a1b-405a-41f1-9696-968f1aae2f00\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.750265 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content\") pod \"18c25a1b-405a-41f1-9696-968f1aae2f00\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.750299 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfknh\" (UniqueName: \"kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh\") pod \"18c25a1b-405a-41f1-9696-968f1aae2f00\" (UID: \"18c25a1b-405a-41f1-9696-968f1aae2f00\") " Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.752669 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities" (OuterVolumeSpecName: "utilities") pod 
"18c25a1b-405a-41f1-9696-968f1aae2f00" (UID: "18c25a1b-405a-41f1-9696-968f1aae2f00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.764356 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh" (OuterVolumeSpecName: "kube-api-access-dfknh") pod "18c25a1b-405a-41f1-9696-968f1aae2f00" (UID: "18c25a1b-405a-41f1-9696-968f1aae2f00"). InnerVolumeSpecName "kube-api-access-dfknh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.815027 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18c25a1b-405a-41f1-9696-968f1aae2f00" (UID: "18c25a1b-405a-41f1-9696-968f1aae2f00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.851874 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.851923 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18c25a1b-405a-41f1-9696-968f1aae2f00-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:26 crc kubenswrapper[5118]: I0223 07:21:26.851960 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfknh\" (UniqueName: \"kubernetes.io/projected/18c25a1b-405a-41f1-9696-968f1aae2f00-kube-api-access-dfknh\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.021046 5118 generic.go:334] "Generic (PLEG): container finished" podID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerID="a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7" exitCode=0 Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.021176 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sd8gs" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.021166 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerDied","Data":"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7"} Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.021343 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sd8gs" event={"ID":"18c25a1b-405a-41f1-9696-968f1aae2f00","Type":"ContainerDied","Data":"86b23f5a73fc0730deb648cfe8dc9d09f85751cf03979887f4460a6428ed36d0"} Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.021378 5118 scope.go:117] "RemoveContainer" containerID="a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.026169 5118 generic.go:334] "Generic (PLEG): container finished" podID="69456a54-9a29-4036-af66-f6c601f7c43d" containerID="a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264" exitCode=0 Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.026200 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerDied","Data":"a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264"} Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.068135 5118 scope.go:117] "RemoveContainer" containerID="4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.090391 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sd8gs"] Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.097242 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-sd8gs"] Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.104778 5118 scope.go:117] "RemoveContainer" containerID="4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.128821 5118 scope.go:117] "RemoveContainer" containerID="a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7" Feb 23 07:21:27 crc kubenswrapper[5118]: E0223 07:21:27.129515 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7\": container with ID starting with a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7 not found: ID does not exist" containerID="a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.129588 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7"} err="failed to get container status \"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7\": rpc error: code = NotFound desc = could not find container \"a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7\": container with ID starting with a7e79e92b139133f9f33424264335b8b006a7157faebea254e27c8fb908589b7 not found: ID does not exist" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.129629 5118 scope.go:117] "RemoveContainer" containerID="4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a" Feb 23 07:21:27 crc kubenswrapper[5118]: E0223 07:21:27.130184 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a\": container with ID starting with 
4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a not found: ID does not exist" containerID="4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.130248 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a"} err="failed to get container status \"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a\": rpc error: code = NotFound desc = could not find container \"4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a\": container with ID starting with 4fc937cddbd110e6103e7a9c051a9a9ca6e5009e4a60f1b242a6cf5f7fe5d52a not found: ID does not exist" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.130291 5118 scope.go:117] "RemoveContainer" containerID="4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018" Feb 23 07:21:27 crc kubenswrapper[5118]: E0223 07:21:27.130792 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018\": container with ID starting with 4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018 not found: ID does not exist" containerID="4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.130845 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018"} err="failed to get container status \"4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018\": rpc error: code = NotFound desc = could not find container \"4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018\": container with ID starting with 4597542bcc422f86250699c87fcda94119dd09ba829175c4a3c74268183e8018 not found: ID does not 
exist" Feb 23 07:21:27 crc kubenswrapper[5118]: I0223 07:21:27.718614 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" path="/var/lib/kubelet/pods/18c25a1b-405a-41f1-9696-968f1aae2f00/volumes" Feb 23 07:21:28 crc kubenswrapper[5118]: I0223 07:21:28.037373 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerStarted","Data":"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f"} Feb 23 07:21:28 crc kubenswrapper[5118]: I0223 07:21:28.067246 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6wzcs" podStartSLOduration=2.328626316 podStartE2EDuration="8.067219038s" podCreationTimestamp="2026-02-23 07:21:20 +0000 UTC" firstStartedPulling="2026-02-23 07:21:21.971485421 +0000 UTC m=+2144.975269994" lastFinishedPulling="2026-02-23 07:21:27.710078103 +0000 UTC m=+2150.713862716" observedRunningTime="2026-02-23 07:21:28.062620119 +0000 UTC m=+2151.066404732" watchObservedRunningTime="2026-02-23 07:21:28.067219038 +0000 UTC m=+2151.071003621" Feb 23 07:21:31 crc kubenswrapper[5118]: I0223 07:21:31.355224 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:31 crc kubenswrapper[5118]: I0223 07:21:31.356176 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:31 crc kubenswrapper[5118]: I0223 07:21:31.435489 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:32 crc kubenswrapper[5118]: I0223 07:21:32.130157 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 07:21:33 crc 
kubenswrapper[5118]: I0223 07:21:33.062384 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"] Feb 23 07:21:33 crc kubenswrapper[5118]: I0223 07:21:33.213898 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"] Feb 23 07:21:33 crc kubenswrapper[5118]: I0223 07:21:33.215340 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qgpxb" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="registry-server" containerID="cri-o://88f93f31c396438b5dc50e1f8b57df9623d49192bf1f6d9d4ae9f0531e266159" gracePeriod=2 Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.088791 5118 generic.go:334] "Generic (PLEG): container finished" podID="2eea3976-c9c9-4160-861a-251e0822d119" containerID="88f93f31c396438b5dc50e1f8b57df9623d49192bf1f6d9d4ae9f0531e266159" exitCode=0 Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.088871 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerDied","Data":"88f93f31c396438b5dc50e1f8b57df9623d49192bf1f6d9d4ae9f0531e266159"} Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.453370 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.590730 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnl2c\" (UniqueName: \"kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c\") pod \"2eea3976-c9c9-4160-861a-251e0822d119\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.590809 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content\") pod \"2eea3976-c9c9-4160-861a-251e0822d119\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.590934 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities\") pod \"2eea3976-c9c9-4160-861a-251e0822d119\" (UID: \"2eea3976-c9c9-4160-861a-251e0822d119\") " Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.593319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities" (OuterVolumeSpecName: "utilities") pod "2eea3976-c9c9-4160-861a-251e0822d119" (UID: "2eea3976-c9c9-4160-861a-251e0822d119"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.598689 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c" (OuterVolumeSpecName: "kube-api-access-lnl2c") pod "2eea3976-c9c9-4160-861a-251e0822d119" (UID: "2eea3976-c9c9-4160-861a-251e0822d119"). InnerVolumeSpecName "kube-api-access-lnl2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.666448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eea3976-c9c9-4160-861a-251e0822d119" (UID: "2eea3976-c9c9-4160-861a-251e0822d119"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.692386 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.692422 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnl2c\" (UniqueName: \"kubernetes.io/projected/2eea3976-c9c9-4160-861a-251e0822d119-kube-api-access-lnl2c\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:34 crc kubenswrapper[5118]: I0223 07:21:34.692432 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eea3976-c9c9-4160-861a-251e0822d119-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.100658 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qgpxb" event={"ID":"2eea3976-c9c9-4160-861a-251e0822d119","Type":"ContainerDied","Data":"ecf056beba4d1c834e9287ee2a800f2c2486f3b7dcc7cbaee46bc1e92227bea6"} Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.101025 5118 scope.go:117] "RemoveContainer" containerID="88f93f31c396438b5dc50e1f8b57df9623d49192bf1f6d9d4ae9f0531e266159" Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.100762 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qgpxb" Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.138265 5118 scope.go:117] "RemoveContainer" containerID="1441700cfa0e441468f6451100683aad2b71408650770b08579d10650fc5d65b" Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.139711 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"] Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.143934 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qgpxb"] Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.164381 5118 scope.go:117] "RemoveContainer" containerID="1852d0a45e6c6264a786f75fdc453d244ecee2dcf202f634ca697c21f6bfaeaa" Feb 23 07:21:35 crc kubenswrapper[5118]: I0223 07:21:35.707603 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eea3976-c9c9-4160-861a-251e0822d119" path="/var/lib/kubelet/pods/2eea3976-c9c9-4160-861a-251e0822d119/volumes" Feb 23 07:22:02 crc kubenswrapper[5118]: I0223 07:22:02.975303 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:22:02 crc kubenswrapper[5118]: I0223 07:22:02.976328 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:22:32 crc kubenswrapper[5118]: I0223 07:22:32.975592 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:22:32 crc kubenswrapper[5118]: I0223 07:22:32.978248 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:23:02 crc kubenswrapper[5118]: I0223 07:23:02.975944 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:23:02 crc kubenswrapper[5118]: I0223 07:23:02.976847 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:23:02 crc kubenswrapper[5118]: I0223 07:23:02.976950 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:23:02 crc kubenswrapper[5118]: I0223 07:23:02.978078 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:23:02 crc kubenswrapper[5118]: I0223 07:23:02.978214 5118 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb" gracePeriod=600 Feb 23 07:23:04 crc kubenswrapper[5118]: I0223 07:23:04.047495 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb" exitCode=0 Feb 23 07:23:04 crc kubenswrapper[5118]: I0223 07:23:04.047617 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb"} Feb 23 07:23:04 crc kubenswrapper[5118]: I0223 07:23:04.048574 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610"} Feb 23 07:23:04 crc kubenswrapper[5118]: I0223 07:23:04.048613 5118 scope.go:117] "RemoveContainer" containerID="04c8a3d5cd7363abeecab1a72e4499ed12f1a3c2b7246bd663d5744f4b6ca87b" Feb 23 07:25:32 crc kubenswrapper[5118]: I0223 07:25:32.975638 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:25:32 crc kubenswrapper[5118]: I0223 07:25:32.976960 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:26:02 crc kubenswrapper[5118]: I0223 07:26:02.974826 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:26:02 crc kubenswrapper[5118]: I0223 07:26:02.975459 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:26:32 crc kubenswrapper[5118]: I0223 07:26:32.974878 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:26:32 crc kubenswrapper[5118]: I0223 07:26:32.975476 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:26:32 crc kubenswrapper[5118]: I0223 07:26:32.975541 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:26:32 crc kubenswrapper[5118]: I0223 07:26:32.976378 5118 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:26:32 crc kubenswrapper[5118]: I0223 07:26:32.976471 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" gracePeriod=600 Feb 23 07:26:33 crc kubenswrapper[5118]: E0223 07:26:33.113234 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:26:33 crc kubenswrapper[5118]: I0223 07:26:33.259560 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" exitCode=0 Feb 23 07:26:33 crc kubenswrapper[5118]: I0223 07:26:33.259632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610"} Feb 23 07:26:33 crc kubenswrapper[5118]: I0223 07:26:33.259677 5118 scope.go:117] "RemoveContainer" 
containerID="6cf3ce18b3e447596d5f957cc6d7496ecf83f37f59e43c14afff5b406bda44fb" Feb 23 07:26:33 crc kubenswrapper[5118]: I0223 07:26:33.260406 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:26:33 crc kubenswrapper[5118]: E0223 07:26:33.260834 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:26:48 crc kubenswrapper[5118]: I0223 07:26:48.697736 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:26:48 crc kubenswrapper[5118]: E0223 07:26:48.698807 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:27:03 crc kubenswrapper[5118]: I0223 07:27:03.697552 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:27:03 crc kubenswrapper[5118]: E0223 07:27:03.699295 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:27:17 crc kubenswrapper[5118]: I0223 07:27:17.705215 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:27:17 crc kubenswrapper[5118]: E0223 07:27:17.706501 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:27:30 crc kubenswrapper[5118]: I0223 07:27:30.697735 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:27:30 crc kubenswrapper[5118]: E0223 07:27:30.700412 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:27:43 crc kubenswrapper[5118]: I0223 07:27:43.697303 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:27:43 crc kubenswrapper[5118]: E0223 07:27:43.698296 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:27:49 crc kubenswrapper[5118]: I0223 07:27:49.214393 5118 scope.go:117] "RemoveContainer" containerID="721385d53d80be9c90df44fe213e7bface8f17e81a107d31ee466c4a46661e9e" Feb 23 07:27:49 crc kubenswrapper[5118]: I0223 07:27:49.248687 5118 scope.go:117] "RemoveContainer" containerID="f95af26e1b525d6febf2152271811142c2fc057b1eec5e581870a9d870a156e1" Feb 23 07:27:49 crc kubenswrapper[5118]: I0223 07:27:49.290650 5118 scope.go:117] "RemoveContainer" containerID="d2360ddb8f776f22ecc6218d547efcbfd4e204eb4c34644a1df9e0980d095b23" Feb 23 07:27:55 crc kubenswrapper[5118]: I0223 07:27:55.697195 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:27:55 crc kubenswrapper[5118]: E0223 07:27:55.698241 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.624699 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626081 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="registry-server" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626139 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="registry-server" Feb 
23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626173 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="extract-content" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626194 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="extract-content" Feb 23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626220 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="registry-server" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626238 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="registry-server" Feb 23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626269 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="extract-content" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626303 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="extract-content" Feb 23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626327 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="extract-utilities" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626339 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="extract-utilities" Feb 23 07:28:00 crc kubenswrapper[5118]: E0223 07:28:00.626360 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="extract-utilities" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626373 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="extract-utilities" Feb 23 
07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626663 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c25a1b-405a-41f1-9696-968f1aae2f00" containerName="registry-server" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.626697 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eea3976-c9c9-4160-861a-251e0822d119" containerName="registry-server" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.633224 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.641793 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.759169 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.759253 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.759600 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxgtj\" (UniqueName: \"kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 
07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.861512 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.861964 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.862488 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxgtj\" (UniqueName: \"kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.862582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.862600 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.904375 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxgtj\" (UniqueName: \"kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj\") pod \"redhat-operators-9t84t\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:00 crc kubenswrapper[5118]: I0223 07:28:00.976328 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:01 crc kubenswrapper[5118]: I0223 07:28:01.503197 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:01 crc kubenswrapper[5118]: W0223 07:28:01.511239 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0425d98a_36d6_4dda_9ebb_549bde985d14.slice/crio-e1c9d58f9cedaa20ce07df11103e720ac0b1e2e48f7a94fb291cf8c222f11430 WatchSource:0}: Error finding container e1c9d58f9cedaa20ce07df11103e720ac0b1e2e48f7a94fb291cf8c222f11430: Status 404 returned error can't find the container with id e1c9d58f9cedaa20ce07df11103e720ac0b1e2e48f7a94fb291cf8c222f11430 Feb 23 07:28:02 crc kubenswrapper[5118]: I0223 07:28:02.106302 5118 generic.go:334] "Generic (PLEG): container finished" podID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerID="5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134" exitCode=0 Feb 23 07:28:02 crc kubenswrapper[5118]: I0223 07:28:02.106629 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerDied","Data":"5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134"} Feb 23 07:28:02 crc kubenswrapper[5118]: I0223 07:28:02.106656 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" 
event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerStarted","Data":"e1c9d58f9cedaa20ce07df11103e720ac0b1e2e48f7a94fb291cf8c222f11430"} Feb 23 07:28:02 crc kubenswrapper[5118]: I0223 07:28:02.112388 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:28:03 crc kubenswrapper[5118]: I0223 07:28:03.115291 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerStarted","Data":"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0"} Feb 23 07:28:04 crc kubenswrapper[5118]: I0223 07:28:04.127162 5118 generic.go:334] "Generic (PLEG): container finished" podID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerID="406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0" exitCode=0 Feb 23 07:28:04 crc kubenswrapper[5118]: I0223 07:28:04.127917 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerDied","Data":"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0"} Feb 23 07:28:05 crc kubenswrapper[5118]: I0223 07:28:05.139813 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerStarted","Data":"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b"} Feb 23 07:28:05 crc kubenswrapper[5118]: I0223 07:28:05.167552 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9t84t" podStartSLOduration=2.759656562 podStartE2EDuration="5.167530474s" podCreationTimestamp="2026-02-23 07:28:00 +0000 UTC" firstStartedPulling="2026-02-23 07:28:02.112109158 +0000 UTC m=+2545.115893721" lastFinishedPulling="2026-02-23 07:28:04.51998306 +0000 UTC m=+2547.523767633" 
observedRunningTime="2026-02-23 07:28:05.164171332 +0000 UTC m=+2548.167955935" watchObservedRunningTime="2026-02-23 07:28:05.167530474 +0000 UTC m=+2548.171315067" Feb 23 07:28:10 crc kubenswrapper[5118]: I0223 07:28:10.697464 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:28:10 crc kubenswrapper[5118]: E0223 07:28:10.698581 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:28:10 crc kubenswrapper[5118]: I0223 07:28:10.976974 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:10 crc kubenswrapper[5118]: I0223 07:28:10.977069 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:12 crc kubenswrapper[5118]: I0223 07:28:12.052076 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9t84t" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="registry-server" probeResult="failure" output=< Feb 23 07:28:12 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 07:28:12 crc kubenswrapper[5118]: > Feb 23 07:28:21 crc kubenswrapper[5118]: I0223 07:28:21.055125 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:21 crc kubenswrapper[5118]: I0223 07:28:21.135436 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:21 crc kubenswrapper[5118]: I0223 07:28:21.313455 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.328704 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9t84t" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="registry-server" containerID="cri-o://d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b" gracePeriod=2 Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.698480 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:28:22 crc kubenswrapper[5118]: E0223 07:28:22.699670 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.819521 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.860398 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxgtj\" (UniqueName: \"kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj\") pod \"0425d98a-36d6-4dda-9ebb-549bde985d14\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.860528 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content\") pod \"0425d98a-36d6-4dda-9ebb-549bde985d14\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.860669 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities\") pod \"0425d98a-36d6-4dda-9ebb-549bde985d14\" (UID: \"0425d98a-36d6-4dda-9ebb-549bde985d14\") " Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.862866 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities" (OuterVolumeSpecName: "utilities") pod "0425d98a-36d6-4dda-9ebb-549bde985d14" (UID: "0425d98a-36d6-4dda-9ebb-549bde985d14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.889357 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj" (OuterVolumeSpecName: "kube-api-access-sxgtj") pod "0425d98a-36d6-4dda-9ebb-549bde985d14" (UID: "0425d98a-36d6-4dda-9ebb-549bde985d14"). InnerVolumeSpecName "kube-api-access-sxgtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.964275 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxgtj\" (UniqueName: \"kubernetes.io/projected/0425d98a-36d6-4dda-9ebb-549bde985d14-kube-api-access-sxgtj\") on node \"crc\" DevicePath \"\"" Feb 23 07:28:22 crc kubenswrapper[5118]: I0223 07:28:22.964374 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.075971 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0425d98a-36d6-4dda-9ebb-549bde985d14" (UID: "0425d98a-36d6-4dda-9ebb-549bde985d14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.168368 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0425d98a-36d6-4dda-9ebb-549bde985d14-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.342902 5118 generic.go:334] "Generic (PLEG): container finished" podID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerID="d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b" exitCode=0 Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.343006 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9t84t" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.343179 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerDied","Data":"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b"} Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.345062 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t84t" event={"ID":"0425d98a-36d6-4dda-9ebb-549bde985d14","Type":"ContainerDied","Data":"e1c9d58f9cedaa20ce07df11103e720ac0b1e2e48f7a94fb291cf8c222f11430"} Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.345146 5118 scope.go:117] "RemoveContainer" containerID="d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.391928 5118 scope.go:117] "RemoveContainer" containerID="406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.403341 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.416496 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9t84t"] Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.427697 5118 scope.go:117] "RemoveContainer" containerID="5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.457499 5118 scope.go:117] "RemoveContainer" containerID="d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b" Feb 23 07:28:23 crc kubenswrapper[5118]: E0223 07:28:23.458293 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b\": container with ID starting with d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b not found: ID does not exist" containerID="d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.458378 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b"} err="failed to get container status \"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b\": rpc error: code = NotFound desc = could not find container \"d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b\": container with ID starting with d47afb7f7d1568103dde01d4efacc3093119b71c957b6c2cf0bfb0307bce9a5b not found: ID does not exist" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.458437 5118 scope.go:117] "RemoveContainer" containerID="406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0" Feb 23 07:28:23 crc kubenswrapper[5118]: E0223 07:28:23.459074 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0\": container with ID starting with 406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0 not found: ID does not exist" containerID="406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.459366 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0"} err="failed to get container status \"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0\": rpc error: code = NotFound desc = could not find container \"406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0\": container with ID 
starting with 406a67fed9f734703d6e0ae2cfef91399cd3b264d941c0fa3ee94d23990b59e0 not found: ID does not exist" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.459534 5118 scope.go:117] "RemoveContainer" containerID="5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134" Feb 23 07:28:23 crc kubenswrapper[5118]: E0223 07:28:23.460379 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134\": container with ID starting with 5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134 not found: ID does not exist" containerID="5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.460439 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134"} err="failed to get container status \"5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134\": rpc error: code = NotFound desc = could not find container \"5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134\": container with ID starting with 5cbf9cd22da9ce9d9f2144749d46d58f0a1a169554491538fd73118d4e1f7134 not found: ID does not exist" Feb 23 07:28:23 crc kubenswrapper[5118]: I0223 07:28:23.715460 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" path="/var/lib/kubelet/pods/0425d98a-36d6-4dda-9ebb-549bde985d14/volumes" Feb 23 07:28:33 crc kubenswrapper[5118]: I0223 07:28:33.698266 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:28:33 crc kubenswrapper[5118]: E0223 07:28:33.699655 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:28:48 crc kubenswrapper[5118]: I0223 07:28:48.697562 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:28:48 crc kubenswrapper[5118]: E0223 07:28:48.698498 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:29:02 crc kubenswrapper[5118]: I0223 07:29:02.698562 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:29:02 crc kubenswrapper[5118]: E0223 07:29:02.699483 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:29:15 crc kubenswrapper[5118]: I0223 07:29:15.697951 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:29:15 crc kubenswrapper[5118]: E0223 07:29:15.698777 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:29:26 crc kubenswrapper[5118]: I0223 07:29:26.698449 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:29:26 crc kubenswrapper[5118]: E0223 07:29:26.700149 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:29:39 crc kubenswrapper[5118]: I0223 07:29:39.697430 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:29:39 crc kubenswrapper[5118]: E0223 07:29:39.698695 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:29:54 crc kubenswrapper[5118]: I0223 07:29:54.697808 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:29:54 crc kubenswrapper[5118]: E0223 07:29:54.698825 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.155804 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn"] Feb 23 07:30:00 crc kubenswrapper[5118]: E0223 07:30:00.156499 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="extract-utilities" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.156514 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="extract-utilities" Feb 23 07:30:00 crc kubenswrapper[5118]: E0223 07:30:00.156532 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.156540 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5118]: E0223 07:30:00.156572 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="extract-content" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.156581 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="extract-content" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.156748 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0425d98a-36d6-4dda-9ebb-549bde985d14" containerName="registry-server" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.157367 5118 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.160328 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.160372 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.173839 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn"] Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.318658 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.318720 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.318751 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chlz\" (UniqueName: \"kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.420325 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.420429 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chlz\" (UniqueName: \"kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.420583 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.421576 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.429984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.439049 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chlz\" (UniqueName: \"kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz\") pod \"collect-profiles-29530530-8zhjn\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.480135 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:00 crc kubenswrapper[5118]: I0223 07:30:00.996141 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn"] Feb 23 07:30:01 crc kubenswrapper[5118]: I0223 07:30:01.619912 5118 generic.go:334] "Generic (PLEG): container finished" podID="39bb2b60-4d35-423f-ac18-c48d017800cd" containerID="123da3c0cc5c405777b801da8c3912f59cb81951ccbf7ca79f59b658d01a7388" exitCode=0 Feb 23 07:30:01 crc kubenswrapper[5118]: I0223 07:30:01.620044 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" event={"ID":"39bb2b60-4d35-423f-ac18-c48d017800cd","Type":"ContainerDied","Data":"123da3c0cc5c405777b801da8c3912f59cb81951ccbf7ca79f59b658d01a7388"} Feb 23 07:30:01 crc kubenswrapper[5118]: I0223 07:30:01.620270 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" 
event={"ID":"39bb2b60-4d35-423f-ac18-c48d017800cd","Type":"ContainerStarted","Data":"618dd669e6ed115f1e4a8f79267164b7e3701c0c998527915544ed85f9464300"} Feb 23 07:30:02 crc kubenswrapper[5118]: I0223 07:30:02.940770 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.067238 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume\") pod \"39bb2b60-4d35-423f-ac18-c48d017800cd\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.067360 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume\") pod \"39bb2b60-4d35-423f-ac18-c48d017800cd\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.067491 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chlz\" (UniqueName: \"kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz\") pod \"39bb2b60-4d35-423f-ac18-c48d017800cd\" (UID: \"39bb2b60-4d35-423f-ac18-c48d017800cd\") " Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.069789 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "39bb2b60-4d35-423f-ac18-c48d017800cd" (UID: "39bb2b60-4d35-423f-ac18-c48d017800cd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.090271 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39bb2b60-4d35-423f-ac18-c48d017800cd" (UID: "39bb2b60-4d35-423f-ac18-c48d017800cd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.090354 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz" (OuterVolumeSpecName: "kube-api-access-6chlz") pod "39bb2b60-4d35-423f-ac18-c48d017800cd" (UID: "39bb2b60-4d35-423f-ac18-c48d017800cd"). InnerVolumeSpecName "kube-api-access-6chlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.168853 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6chlz\" (UniqueName: \"kubernetes.io/projected/39bb2b60-4d35-423f-ac18-c48d017800cd-kube-api-access-6chlz\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.168905 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39bb2b60-4d35-423f-ac18-c48d017800cd-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.168917 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb2b60-4d35-423f-ac18-c48d017800cd-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.636899 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" 
event={"ID":"39bb2b60-4d35-423f-ac18-c48d017800cd","Type":"ContainerDied","Data":"618dd669e6ed115f1e4a8f79267164b7e3701c0c998527915544ed85f9464300"} Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.636957 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="618dd669e6ed115f1e4a8f79267164b7e3701c0c998527915544ed85f9464300" Feb 23 07:30:03 crc kubenswrapper[5118]: I0223 07:30:03.636993 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn" Feb 23 07:30:04 crc kubenswrapper[5118]: I0223 07:30:04.051092 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"] Feb 23 07:30:04 crc kubenswrapper[5118]: I0223 07:30:04.061236 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530485-fmxd5"] Feb 23 07:30:05 crc kubenswrapper[5118]: I0223 07:30:05.710381 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03934e78-e05d-4971-a195-9a9df7443df3" path="/var/lib/kubelet/pods/03934e78-e05d-4971-a195-9a9df7443df3/volumes" Feb 23 07:30:08 crc kubenswrapper[5118]: I0223 07:30:08.697304 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:30:08 crc kubenswrapper[5118]: E0223 07:30:08.697995 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:30:20 crc kubenswrapper[5118]: I0223 07:30:20.696982 5118 scope.go:117] "RemoveContainer" 
containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:30:20 crc kubenswrapper[5118]: E0223 07:30:20.697643 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:30:34 crc kubenswrapper[5118]: I0223 07:30:34.697909 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:30:34 crc kubenswrapper[5118]: E0223 07:30:34.698899 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:30:48 crc kubenswrapper[5118]: I0223 07:30:48.697549 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:30:48 crc kubenswrapper[5118]: E0223 07:30:48.701298 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:30:49 crc kubenswrapper[5118]: I0223 07:30:49.433174 5118 scope.go:117] 
"RemoveContainer" containerID="69ad52863daa08e8010dc71adba88dc18e5ac3b5cfd724da71a0c774535ccb9f" Feb 23 07:31:03 crc kubenswrapper[5118]: I0223 07:31:03.697077 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:31:03 crc kubenswrapper[5118]: E0223 07:31:03.698170 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:31:18 crc kubenswrapper[5118]: I0223 07:31:18.697803 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:31:18 crc kubenswrapper[5118]: E0223 07:31:18.698895 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:31:30 crc kubenswrapper[5118]: I0223 07:31:30.699485 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:31:30 crc kubenswrapper[5118]: E0223 07:31:30.700601 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.612630 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:38 crc kubenswrapper[5118]: E0223 07:31:38.615141 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bb2b60-4d35-423f-ac18-c48d017800cd" containerName="collect-profiles" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.615165 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bb2b60-4d35-423f-ac18-c48d017800cd" containerName="collect-profiles" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.615458 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bb2b60-4d35-423f-ac18-c48d017800cd" containerName="collect-profiles" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.617249 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.642599 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.792936 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.794552 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.810960 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.810989 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.812388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.812499 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwj5t\" (UniqueName: \"kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.913713 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgf8g\" (UniqueName: \"kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.913785 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.913988 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwj5t\" (UniqueName: \"kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.914167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.914209 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.914380 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 
07:31:38.914385 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.914699 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.948151 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwj5t\" (UniqueName: \"kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t\") pod \"certified-operators-c8lrw\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:38 crc kubenswrapper[5118]: I0223 07:31:38.957729 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.015711 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.015845 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.015927 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgf8g\" (UniqueName: \"kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.016375 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.016447 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " 
pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.039167 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgf8g\" (UniqueName: \"kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g\") pod \"community-operators-k67zz\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.126246 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.423960 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.477691 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.575848 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerStarted","Data":"dc6f7d5e31ac54705b6dcf43f4c76e1a46cf6c80466fa4274db76fe7a48bcdd1"} Feb 23 07:31:39 crc kubenswrapper[5118]: I0223 07:31:39.583747 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerStarted","Data":"3263c5196f79e598d736fcf110454fa6c02244ceded81a212956f31c46d331a7"} Feb 23 07:31:40 crc kubenswrapper[5118]: I0223 07:31:40.597172 5118 generic.go:334] "Generic (PLEG): container finished" podID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerID="8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc" exitCode=0 Feb 23 07:31:40 crc kubenswrapper[5118]: I0223 07:31:40.597292 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerDied","Data":"8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc"} Feb 23 07:31:40 crc kubenswrapper[5118]: I0223 07:31:40.601715 5118 generic.go:334] "Generic (PLEG): container finished" podID="a788cf7a-36bb-4c30-ba47-38078526903f" containerID="634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70" exitCode=0 Feb 23 07:31:40 crc kubenswrapper[5118]: I0223 07:31:40.601851 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerDied","Data":"634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70"} Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.001496 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"] Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.003218 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.024235 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"] Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.080609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.080658 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.080686 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2bt\" (UniqueName: \"kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.182362 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.182490 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.182551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2bt\" (UniqueName: \"kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.182776 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.183203 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.215130 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl2bt\" (UniqueName: \"kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt\") pod \"redhat-marketplace-cklrn\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") " pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.337884 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.613188 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerStarted","Data":"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275"} Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.616729 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerStarted","Data":"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258"} Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.682251 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"] Feb 23 07:31:41 crc kubenswrapper[5118]: I0223 07:31:41.697039 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610" Feb 23 07:31:41 crc kubenswrapper[5118]: W0223 07:31:41.735207 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb745a88_2d1b_46b8_b592_54e4f0973919.slice/crio-51d3b5322b2035c2f6b49ba1708ce2d1b964cae4833b418f1952f61b28a3ce53 WatchSource:0}: Error finding container 51d3b5322b2035c2f6b49ba1708ce2d1b964cae4833b418f1952f61b28a3ce53: Status 404 returned error can't find the container with id 51d3b5322b2035c2f6b49ba1708ce2d1b964cae4833b418f1952f61b28a3ce53 Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.628551 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042"} Feb 23 07:31:42 crc kubenswrapper[5118]: 
I0223 07:31:42.632250 5118 generic.go:334] "Generic (PLEG): container finished" podID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerID="b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275" exitCode=0 Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.632374 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerDied","Data":"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275"} Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.635599 5118 generic.go:334] "Generic (PLEG): container finished" podID="a788cf7a-36bb-4c30-ba47-38078526903f" containerID="9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258" exitCode=0 Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.635674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerDied","Data":"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258"} Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.638545 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerID="2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e" exitCode=0 Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.638606 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerDied","Data":"2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e"} Feb 23 07:31:42 crc kubenswrapper[5118]: I0223 07:31:42.638640 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" 
event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerStarted","Data":"51d3b5322b2035c2f6b49ba1708ce2d1b964cae4833b418f1952f61b28a3ce53"} Feb 23 07:31:43 crc kubenswrapper[5118]: I0223 07:31:43.652486 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerStarted","Data":"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965"} Feb 23 07:31:43 crc kubenswrapper[5118]: I0223 07:31:43.656561 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerStarted","Data":"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d"} Feb 23 07:31:43 crc kubenswrapper[5118]: I0223 07:31:43.678300 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerStarted","Data":"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"} Feb 23 07:31:43 crc kubenswrapper[5118]: I0223 07:31:43.696334 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8lrw" podStartSLOduration=3.225824025 podStartE2EDuration="5.696314146s" podCreationTimestamp="2026-02-23 07:31:38 +0000 UTC" firstStartedPulling="2026-02-23 07:31:40.601426523 +0000 UTC m=+2763.605211146" lastFinishedPulling="2026-02-23 07:31:43.071916654 +0000 UTC m=+2766.075701267" observedRunningTime="2026-02-23 07:31:43.69524903 +0000 UTC m=+2766.699033603" watchObservedRunningTime="2026-02-23 07:31:43.696314146 +0000 UTC m=+2766.700098719" Feb 23 07:31:43 crc kubenswrapper[5118]: I0223 07:31:43.737866 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k67zz" podStartSLOduration=3.258332323 
podStartE2EDuration="5.737844373s" podCreationTimestamp="2026-02-23 07:31:38 +0000 UTC" firstStartedPulling="2026-02-23 07:31:40.60504017 +0000 UTC m=+2763.608824773" lastFinishedPulling="2026-02-23 07:31:43.08455221 +0000 UTC m=+2766.088336823" observedRunningTime="2026-02-23 07:31:43.736363657 +0000 UTC m=+2766.740148230" watchObservedRunningTime="2026-02-23 07:31:43.737844373 +0000 UTC m=+2766.741628946" Feb 23 07:31:44 crc kubenswrapper[5118]: I0223 07:31:44.701574 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerID="65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af" exitCode=0 Feb 23 07:31:44 crc kubenswrapper[5118]: I0223 07:31:44.703722 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerDied","Data":"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"} Feb 23 07:31:44 crc kubenswrapper[5118]: I0223 07:31:44.706823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerStarted","Data":"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"} Feb 23 07:31:44 crc kubenswrapper[5118]: I0223 07:31:44.746855 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cklrn" podStartSLOduration=3.239776254 podStartE2EDuration="4.746826841s" podCreationTimestamp="2026-02-23 07:31:40 +0000 UTC" firstStartedPulling="2026-02-23 07:31:42.643416002 +0000 UTC m=+2765.647200615" lastFinishedPulling="2026-02-23 07:31:44.150466629 +0000 UTC m=+2767.154251202" observedRunningTime="2026-02-23 07:31:44.739661177 +0000 UTC m=+2767.743445740" watchObservedRunningTime="2026-02-23 07:31:44.746826841 +0000 UTC m=+2767.750611424" Feb 23 07:31:48 crc kubenswrapper[5118]: I0223 
07:31:48.958682 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:48 crc kubenswrapper[5118]: I0223 07:31:48.959674 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.047613 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.127222 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.127342 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.214597 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.809122 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:49 crc kubenswrapper[5118]: I0223 07:31:49.838752 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.338956 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.339612 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.399866 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.407140 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.780821 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8lrw" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="registry-server" containerID="cri-o://f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965" gracePeriod=2 Feb 23 07:31:51 crc kubenswrapper[5118]: I0223 07:31:51.851888 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cklrn" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.292965 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.477920 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities\") pod \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.477994 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwj5t\" (UniqueName: \"kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t\") pod \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.478142 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content\") pod 
\"b3d3ea90-ed7d-4de3-8eb7-056672edad20\" (UID: \"b3d3ea90-ed7d-4de3-8eb7-056672edad20\") " Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.479143 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities" (OuterVolumeSpecName: "utilities") pod "b3d3ea90-ed7d-4de3-8eb7-056672edad20" (UID: "b3d3ea90-ed7d-4de3-8eb7-056672edad20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.486557 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t" (OuterVolumeSpecName: "kube-api-access-mwj5t") pod "b3d3ea90-ed7d-4de3-8eb7-056672edad20" (UID: "b3d3ea90-ed7d-4de3-8eb7-056672edad20"). InnerVolumeSpecName "kube-api-access-mwj5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.580262 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwj5t\" (UniqueName: \"kubernetes.io/projected/b3d3ea90-ed7d-4de3-8eb7-056672edad20-kube-api-access-mwj5t\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.580301 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.640920 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d3ea90-ed7d-4de3-8eb7-056672edad20" (UID: "b3d3ea90-ed7d-4de3-8eb7-056672edad20"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.681691 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d3ea90-ed7d-4de3-8eb7-056672edad20-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798250 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798269 5118 generic.go:334] "Generic (PLEG): container finished" podID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerID="f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965" exitCode=0 Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798323 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerDied","Data":"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965"} Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798362 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8lrw" event={"ID":"b3d3ea90-ed7d-4de3-8eb7-056672edad20","Type":"ContainerDied","Data":"3263c5196f79e598d736fcf110454fa6c02244ceded81a212956f31c46d331a7"} Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798389 5118 scope.go:117] "RemoveContainer" containerID="f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.798449 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8lrw" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.799399 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k67zz" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="registry-server" containerID="cri-o://1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d" gracePeriod=2 Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.836515 5118 scope.go:117] "RemoveContainer" containerID="b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.844948 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.855152 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8lrw"] Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.877477 5118 scope.go:117] "RemoveContainer" containerID="8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.995377 5118 scope.go:117] "RemoveContainer" containerID="f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965" Feb 23 07:31:52 crc kubenswrapper[5118]: E0223 07:31:52.996649 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965\": container with ID starting with f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965 not found: ID does not exist" containerID="f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.996719 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965"} err="failed to get container status \"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965\": rpc error: code = NotFound desc = could not find container \"f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965\": container with ID starting with f1c0c78121c3030e9db0670d9bd7b493fd3697bb6b9baded248650453e17a965 not found: ID does not exist" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.996792 5118 scope.go:117] "RemoveContainer" containerID="b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275" Feb 23 07:31:52 crc kubenswrapper[5118]: E0223 07:31:52.998172 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275\": container with ID starting with b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275 not found: ID does not exist" containerID="b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.998228 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275"} err="failed to get container status \"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275\": rpc error: code = NotFound desc = could not find container \"b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275\": container with ID starting with b7ba660b961273fd0423d3df4e926389cd8f6f8d9983a06489b63a3054a77275 not found: ID does not exist" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.998265 5118 scope.go:117] "RemoveContainer" containerID="8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc" Feb 23 07:31:52 crc kubenswrapper[5118]: E0223 07:31:52.998855 5118 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc\": container with ID starting with 8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc not found: ID does not exist" containerID="8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc" Feb 23 07:31:52 crc kubenswrapper[5118]: I0223 07:31:52.998894 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc"} err="failed to get container status \"8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc\": rpc error: code = NotFound desc = could not find container \"8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc\": container with ID starting with 8380e3223f8745bc70ea1a039d95765901e1fff071c10bb09cd170cc161e33bc not found: ID does not exist" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.239462 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.395450 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgf8g\" (UniqueName: \"kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g\") pod \"a788cf7a-36bb-4c30-ba47-38078526903f\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.395550 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities\") pod \"a788cf7a-36bb-4c30-ba47-38078526903f\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.395790 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content\") pod \"a788cf7a-36bb-4c30-ba47-38078526903f\" (UID: \"a788cf7a-36bb-4c30-ba47-38078526903f\") " Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.396893 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities" (OuterVolumeSpecName: "utilities") pod "a788cf7a-36bb-4c30-ba47-38078526903f" (UID: "a788cf7a-36bb-4c30-ba47-38078526903f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.406610 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g" (OuterVolumeSpecName: "kube-api-access-cgf8g") pod "a788cf7a-36bb-4c30-ba47-38078526903f" (UID: "a788cf7a-36bb-4c30-ba47-38078526903f"). InnerVolumeSpecName "kube-api-access-cgf8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.466773 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a788cf7a-36bb-4c30-ba47-38078526903f" (UID: "a788cf7a-36bb-4c30-ba47-38078526903f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.499983 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.500042 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgf8g\" (UniqueName: \"kubernetes.io/projected/a788cf7a-36bb-4c30-ba47-38078526903f-kube-api-access-cgf8g\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.500055 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a788cf7a-36bb-4c30-ba47-38078526903f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.711213 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" path="/var/lib/kubelet/pods/b3d3ea90-ed7d-4de3-8eb7-056672edad20/volumes" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.812537 5118 generic.go:334] "Generic (PLEG): container finished" podID="a788cf7a-36bb-4c30-ba47-38078526903f" containerID="1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d" exitCode=0 Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.812615 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" 
event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerDied","Data":"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d"} Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.812648 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67zz" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.812661 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67zz" event={"ID":"a788cf7a-36bb-4c30-ba47-38078526903f","Type":"ContainerDied","Data":"dc6f7d5e31ac54705b6dcf43f4c76e1a46cf6c80466fa4274db76fe7a48bcdd1"} Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.812697 5118 scope.go:117] "RemoveContainer" containerID="1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.846330 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.849021 5118 scope.go:117] "RemoveContainer" containerID="9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.852851 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k67zz"] Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.873040 5118 scope.go:117] "RemoveContainer" containerID="634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70" Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.889047 5118 scope.go:117] "RemoveContainer" containerID="1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d" Feb 23 07:31:53 crc kubenswrapper[5118]: E0223 07:31:53.889737 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d\": container 
with ID starting with 1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d not found: ID does not exist" containerID="1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d"
Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.889804 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d"} err="failed to get container status \"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d\": rpc error: code = NotFound desc = could not find container \"1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d\": container with ID starting with 1d0165ba07676f61c3414764d0662d8cf8dbe626e470d299e141ed19a369967d not found: ID does not exist"
Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.889837 5118 scope.go:117] "RemoveContainer" containerID="9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258"
Feb 23 07:31:53 crc kubenswrapper[5118]: E0223 07:31:53.890349 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258\": container with ID starting with 9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258 not found: ID does not exist" containerID="9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258"
Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.890443 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258"} err="failed to get container status \"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258\": rpc error: code = NotFound desc = could not find container \"9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258\": container with ID starting with 9e608057c86c1384add5c7d08725a8962d3f0de4a0a48b1a0050b1d0532c7258 not found: ID does not exist"
Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.890521 5118 scope.go:117] "RemoveContainer" containerID="634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70"
Feb 23 07:31:53 crc kubenswrapper[5118]: E0223 07:31:53.890992 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70\": container with ID starting with 634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70 not found: ID does not exist" containerID="634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70"
Feb 23 07:31:53 crc kubenswrapper[5118]: I0223 07:31:53.891040 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70"} err="failed to get container status \"634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70\": rpc error: code = NotFound desc = could not find container \"634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70\": container with ID starting with 634d5736b9767f6cec0e20dab75da1bdc276b4078cf3d584ab1f0572d0c13d70 not found: ID does not exist"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.187423 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"]
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.189312 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cklrn" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="registry-server" containerID="cri-o://0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204" gracePeriod=2
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.708831 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" path="/var/lib/kubelet/pods/a788cf7a-36bb-4c30-ba47-38078526903f/volumes"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.715059 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cklrn"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.752880 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities\") pod \"fb745a88-2d1b-46b8-b592-54e4f0973919\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") "
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.753151 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl2bt\" (UniqueName: \"kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt\") pod \"fb745a88-2d1b-46b8-b592-54e4f0973919\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") "
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.753186 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content\") pod \"fb745a88-2d1b-46b8-b592-54e4f0973919\" (UID: \"fb745a88-2d1b-46b8-b592-54e4f0973919\") "
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.754332 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities" (OuterVolumeSpecName: "utilities") pod "fb745a88-2d1b-46b8-b592-54e4f0973919" (UID: "fb745a88-2d1b-46b8-b592-54e4f0973919"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.760397 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt" (OuterVolumeSpecName: "kube-api-access-pl2bt") pod "fb745a88-2d1b-46b8-b592-54e4f0973919" (UID: "fb745a88-2d1b-46b8-b592-54e4f0973919"). InnerVolumeSpecName "kube-api-access-pl2bt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.799000 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb745a88-2d1b-46b8-b592-54e4f0973919" (UID: "fb745a88-2d1b-46b8-b592-54e4f0973919"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.863668 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl2bt\" (UniqueName: \"kubernetes.io/projected/fb745a88-2d1b-46b8-b592-54e4f0973919-kube-api-access-pl2bt\") on node \"crc\" DevicePath \"\""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.863747 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.863765 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb745a88-2d1b-46b8-b592-54e4f0973919-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.877452 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerID="0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204" exitCode=0
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.877578 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerDied","Data":"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"}
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.877636 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cklrn"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.877752 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cklrn" event={"ID":"fb745a88-2d1b-46b8-b592-54e4f0973919","Type":"ContainerDied","Data":"51d3b5322b2035c2f6b49ba1708ce2d1b964cae4833b418f1952f61b28a3ce53"}
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.877898 5118 scope.go:117] "RemoveContainer" containerID="0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.919127 5118 scope.go:117] "RemoveContainer" containerID="65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.928687 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"]
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.939693 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cklrn"]
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.945121 5118 scope.go:117] "RemoveContainer" containerID="2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.969729 5118 scope.go:117] "RemoveContainer" containerID="0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"
Feb 23 07:31:55 crc kubenswrapper[5118]: E0223 07:31:55.970763 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204\": container with ID starting with 0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204 not found: ID does not exist" containerID="0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.970884 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204"} err="failed to get container status \"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204\": rpc error: code = NotFound desc = could not find container \"0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204\": container with ID starting with 0bb963ce3bf1b46c0526ed398feadb276789b5f89136a68a294ceab94ea31204 not found: ID does not exist"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.970968 5118 scope.go:117] "RemoveContainer" containerID="65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"
Feb 23 07:31:55 crc kubenswrapper[5118]: E0223 07:31:55.971773 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af\": container with ID starting with 65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af not found: ID does not exist" containerID="65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.971868 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af"} err="failed to get container status \"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af\": rpc error: code = NotFound desc = could not find container \"65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af\": container with ID starting with 65595ba0eb6fbbfeccf08ca7265febbeb7d925bc72a3eede036f57e090fc97af not found: ID does not exist"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.971958 5118 scope.go:117] "RemoveContainer" containerID="2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e"
Feb 23 07:31:55 crc kubenswrapper[5118]: E0223 07:31:55.972694 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e\": container with ID starting with 2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e not found: ID does not exist" containerID="2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e"
Feb 23 07:31:55 crc kubenswrapper[5118]: I0223 07:31:55.972754 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e"} err="failed to get container status \"2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e\": rpc error: code = NotFound desc = could not find container \"2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e\": container with ID starting with 2d9878d8a72d8b33b345ba8e375b7cb6f9dc5096b3407f4bc7cfe25c6c489f1e not found: ID does not exist"
Feb 23 07:31:57 crc kubenswrapper[5118]: I0223 07:31:57.713943 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" path="/var/lib/kubelet/pods/fb745a88-2d1b-46b8-b592-54e4f0973919/volumes"
Feb 23 07:34:02 crc kubenswrapper[5118]: I0223 07:34:02.975430 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:34:02 crc kubenswrapper[5118]: I0223 07:34:02.976140 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:34:32 crc kubenswrapper[5118]: I0223 07:34:32.975473 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:34:32 crc kubenswrapper[5118]: I0223 07:34:32.976199 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:35:02 crc kubenswrapper[5118]: I0223 07:35:02.975223 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:35:02 crc kubenswrapper[5118]: I0223 07:35:02.976143 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:35:02 crc kubenswrapper[5118]: I0223 07:35:02.976233 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 07:35:02 crc kubenswrapper[5118]: I0223 07:35:02.977267 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:35:02 crc kubenswrapper[5118]: I0223 07:35:02.977417 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042" gracePeriod=600
Feb 23 07:35:03 crc kubenswrapper[5118]: I0223 07:35:03.789113 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042" exitCode=0
Feb 23 07:35:03 crc kubenswrapper[5118]: I0223 07:35:03.789222 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042"}
Feb 23 07:35:03 crc kubenswrapper[5118]: I0223 07:35:03.789567 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35"}
Feb 23 07:35:03 crc kubenswrapper[5118]: I0223 07:35:03.789593 5118 scope.go:117] "RemoveContainer" containerID="f2bbc63b4bec079f931ac5585e2c6e246cc7b8fb427f3d858748352e99638610"
Feb 23 07:37:32 crc kubenswrapper[5118]: I0223 07:37:32.976306 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:37:32 crc kubenswrapper[5118]: I0223 07:37:32.977671 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:38:02 crc kubenswrapper[5118]: I0223 07:38:02.974945 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:38:02 crc kubenswrapper[5118]: I0223 07:38:02.975676 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.183680 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"]
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.184927 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.184949 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.184975 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.184987 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185011 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185023 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185041 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185053 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185074 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185086 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="extract-utilities"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185141 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185154 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185174 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185186 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185202 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185215 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: E0223 07:38:32.185238 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185250 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="extract-content"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185508 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d3ea90-ed7d-4de3-8eb7-056672edad20" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185539 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb745a88-2d1b-46b8-b592-54e4f0973919" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.185562 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a788cf7a-36bb-4c30-ba47-38078526903f" containerName="registry-server"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.187216 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.225085 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"]
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.259524 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.259594 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrt9p\" (UniqueName: \"kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.259663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.360844 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.360954 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrt9p\" (UniqueName: \"kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.361034 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.361717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.361882 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.386860 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrt9p\" (UniqueName: \"kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p\") pod \"redhat-operators-zwgmd\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") " pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.524238 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.975536 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.976127 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.976204 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.977006 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.977137 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" gracePeriod=600
Feb 23 07:38:32 crc kubenswrapper[5118]: I0223 07:38:32.996541 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"]
Feb 23 07:38:33 crc kubenswrapper[5118]: E0223 07:38:33.102235 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.899031 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" exitCode=0
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.899147 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35"}
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.900275 5118 scope.go:117] "RemoveContainer" containerID="4045ffb25c3eb14bb95e1d41a4c918ac196f68ba96217df220c80f68a7b2d042"
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.901250 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35"
Feb 23 07:38:33 crc kubenswrapper[5118]: E0223 07:38:33.901809 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.902119 5118 generic.go:334] "Generic (PLEG): container finished" podID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerID="44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12" exitCode=0
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.902191 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerDied","Data":"44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12"}
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.902236 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerStarted","Data":"60c058eeff7cfdfd49483683041a778fe628e0aa937816fa0006480a3b6de8a2"}
Feb 23 07:38:33 crc kubenswrapper[5118]: I0223 07:38:33.904758 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:38:35 crc kubenswrapper[5118]: I0223 07:38:35.931458 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerStarted","Data":"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289"}
Feb 23 07:38:36 crc kubenswrapper[5118]: I0223 07:38:36.944801 5118 generic.go:334] "Generic (PLEG): container finished" podID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerID="0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289" exitCode=0
Feb 23 07:38:36 crc kubenswrapper[5118]: I0223 07:38:36.944866 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerDied","Data":"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289"}
Feb 23 07:38:37 crc kubenswrapper[5118]: I0223 07:38:37.955433 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerStarted","Data":"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214"}
Feb 23 07:38:37 crc kubenswrapper[5118]: I0223 07:38:37.987066 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwgmd" podStartSLOduration=2.28570908 podStartE2EDuration="5.987048222s" podCreationTimestamp="2026-02-23 07:38:32 +0000 UTC" firstStartedPulling="2026-02-23 07:38:33.90449268 +0000 UTC m=+3176.908277273" lastFinishedPulling="2026-02-23 07:38:37.605831802 +0000 UTC m=+3180.609616415" observedRunningTime="2026-02-23 07:38:37.983047495 +0000 UTC m=+3180.986832068" watchObservedRunningTime="2026-02-23 07:38:37.987048222 +0000 UTC m=+3180.990832795"
Feb 23 07:38:42 crc kubenswrapper[5118]: I0223 07:38:42.524407 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:42 crc kubenswrapper[5118]: I0223 07:38:42.524757 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:43 crc kubenswrapper[5118]: I0223 07:38:43.583188 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwgmd" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:38:43 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s
Feb 23 07:38:43 crc kubenswrapper[5118]: >
Feb 23 07:38:47 crc kubenswrapper[5118]: I0223 07:38:47.704697 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35"
Feb 23 07:38:47 crc kubenswrapper[5118]: E0223 07:38:47.705501 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:38:52 crc kubenswrapper[5118]: I0223 07:38:52.588922 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:52 crc kubenswrapper[5118]: I0223 07:38:52.670705 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:52 crc kubenswrapper[5118]: I0223 07:38:52.838837 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"]
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.096226 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwgmd" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="registry-server" containerID="cri-o://08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214" gracePeriod=2
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.623948 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwgmd"
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.709326 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content\") pod \"83a43f13-de78-4133-9065-dc9a6aa0d692\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") "
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.709426 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrt9p\" (UniqueName: \"kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p\") pod \"83a43f13-de78-4133-9065-dc9a6aa0d692\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") "
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.709616 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities\") pod \"83a43f13-de78-4133-9065-dc9a6aa0d692\" (UID: \"83a43f13-de78-4133-9065-dc9a6aa0d692\") "
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.711485 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities" (OuterVolumeSpecName: "utilities") pod "83a43f13-de78-4133-9065-dc9a6aa0d692" (UID: "83a43f13-de78-4133-9065-dc9a6aa0d692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.716043 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p" (OuterVolumeSpecName: "kube-api-access-vrt9p") pod "83a43f13-de78-4133-9065-dc9a6aa0d692" (UID: "83a43f13-de78-4133-9065-dc9a6aa0d692"). InnerVolumeSpecName "kube-api-access-vrt9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.811916 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.811957 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrt9p\" (UniqueName: \"kubernetes.io/projected/83a43f13-de78-4133-9065-dc9a6aa0d692-kube-api-access-vrt9p\") on node \"crc\" DevicePath \"\""
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.869454 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83a43f13-de78-4133-9065-dc9a6aa0d692" (UID: "83a43f13-de78-4133-9065-dc9a6aa0d692"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 07:38:54 crc kubenswrapper[5118]: I0223 07:38:54.913172 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83a43f13-de78-4133-9065-dc9a6aa0d692-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.112960 5118 generic.go:334] "Generic (PLEG): container finished" podID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerID="08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214" exitCode=0
Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.113039 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerDied","Data":"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214"}
Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.113127 5118 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwgmd" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.113180 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwgmd" event={"ID":"83a43f13-de78-4133-9065-dc9a6aa0d692","Type":"ContainerDied","Data":"60c058eeff7cfdfd49483683041a778fe628e0aa937816fa0006480a3b6de8a2"} Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.113243 5118 scope.go:117] "RemoveContainer" containerID="08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.158777 5118 scope.go:117] "RemoveContainer" containerID="0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.171049 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"] Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.183001 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwgmd"] Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.210816 5118 scope.go:117] "RemoveContainer" containerID="44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.238649 5118 scope.go:117] "RemoveContainer" containerID="08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214" Feb 23 07:38:55 crc kubenswrapper[5118]: E0223 07:38:55.239194 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214\": container with ID starting with 08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214 not found: ID does not exist" containerID="08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.239242 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214"} err="failed to get container status \"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214\": rpc error: code = NotFound desc = could not find container \"08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214\": container with ID starting with 08fab047f60bafba528da478b4fb52d8d64715ce58c4c86d6a381b59d6eb8214 not found: ID does not exist" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.239269 5118 scope.go:117] "RemoveContainer" containerID="0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289" Feb 23 07:38:55 crc kubenswrapper[5118]: E0223 07:38:55.239755 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289\": container with ID starting with 0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289 not found: ID does not exist" containerID="0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.239959 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289"} err="failed to get container status \"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289\": rpc error: code = NotFound desc = could not find container \"0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289\": container with ID starting with 0bafbdefe7bceecfce5f34e318823f2525ce79faf9eb4c4ef60f593271616289 not found: ID does not exist" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.240166 5118 scope.go:117] "RemoveContainer" containerID="44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12" Feb 23 07:38:55 crc kubenswrapper[5118]: E0223 
07:38:55.240968 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12\": container with ID starting with 44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12 not found: ID does not exist" containerID="44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.241013 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12"} err="failed to get container status \"44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12\": rpc error: code = NotFound desc = could not find container \"44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12\": container with ID starting with 44f308714e261f2387e388b8aaabbd6c28dff975a3c4bfd949b7443b142e5a12 not found: ID does not exist" Feb 23 07:38:55 crc kubenswrapper[5118]: I0223 07:38:55.715311 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" path="/var/lib/kubelet/pods/83a43f13-de78-4133-9065-dc9a6aa0d692/volumes" Feb 23 07:38:58 crc kubenswrapper[5118]: I0223 07:38:58.698665 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:38:58 crc kubenswrapper[5118]: E0223 07:38:58.699141 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:39:13 crc kubenswrapper[5118]: I0223 07:39:13.697929 
5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:39:13 crc kubenswrapper[5118]: E0223 07:39:13.698935 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:39:24 crc kubenswrapper[5118]: I0223 07:39:24.697982 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:39:24 crc kubenswrapper[5118]: E0223 07:39:24.699236 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:39:35 crc kubenswrapper[5118]: I0223 07:39:35.697760 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:39:35 crc kubenswrapper[5118]: E0223 07:39:35.699055 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:39:48 crc kubenswrapper[5118]: I0223 
07:39:48.698126 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:39:48 crc kubenswrapper[5118]: E0223 07:39:48.699176 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:39:59 crc kubenswrapper[5118]: I0223 07:39:59.697358 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:39:59 crc kubenswrapper[5118]: E0223 07:39:59.698520 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:40:10 crc kubenswrapper[5118]: I0223 07:40:10.696867 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:40:10 crc kubenswrapper[5118]: E0223 07:40:10.697547 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:40:25 crc 
kubenswrapper[5118]: I0223 07:40:25.697165 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:40:25 crc kubenswrapper[5118]: E0223 07:40:25.698336 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:40:39 crc kubenswrapper[5118]: I0223 07:40:39.696945 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:40:39 crc kubenswrapper[5118]: E0223 07:40:39.698192 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:40:54 crc kubenswrapper[5118]: I0223 07:40:54.698188 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:40:54 crc kubenswrapper[5118]: E0223 07:40:54.700076 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 
23 07:41:06 crc kubenswrapper[5118]: I0223 07:41:06.697014 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:41:06 crc kubenswrapper[5118]: E0223 07:41:06.698308 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:41:19 crc kubenswrapper[5118]: I0223 07:41:19.697292 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:41:19 crc kubenswrapper[5118]: E0223 07:41:19.698393 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:41:30 crc kubenswrapper[5118]: I0223 07:41:30.697071 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:41:30 crc kubenswrapper[5118]: E0223 07:41:30.698388 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:41:43 crc kubenswrapper[5118]: I0223 07:41:43.698063 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:41:43 crc kubenswrapper[5118]: E0223 07:41:43.701123 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:41:55 crc kubenswrapper[5118]: I0223 07:41:55.698961 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:41:55 crc kubenswrapper[5118]: E0223 07:41:55.700303 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:42:08 crc kubenswrapper[5118]: I0223 07:42:08.697966 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:42:08 crc kubenswrapper[5118]: E0223 07:42:08.699223 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.604189 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:17 crc kubenswrapper[5118]: E0223 07:42:17.605399 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="registry-server" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.605421 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="registry-server" Feb 23 07:42:17 crc kubenswrapper[5118]: E0223 07:42:17.605467 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="extract-content" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.605480 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="extract-content" Feb 23 07:42:17 crc kubenswrapper[5118]: E0223 07:42:17.605511 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="extract-utilities" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.605525 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="extract-utilities" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.605764 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="83a43f13-de78-4133-9065-dc9a6aa0d692" containerName="registry-server" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.608269 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.619394 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.741076 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.741731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.742059 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jml27\" (UniqueName: \"kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.843080 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.843175 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.843277 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jml27\" (UniqueName: \"kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.844083 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.844248 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.872837 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jml27\" (UniqueName: \"kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27\") pod \"certified-operators-k72xf\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:17 crc kubenswrapper[5118]: I0223 07:42:17.957023 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:18 crc kubenswrapper[5118]: I0223 07:42:18.414259 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:19 crc kubenswrapper[5118]: I0223 07:42:19.072658 5118 generic.go:334] "Generic (PLEG): container finished" podID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerID="da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d" exitCode=0 Feb 23 07:42:19 crc kubenswrapper[5118]: I0223 07:42:19.072716 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerDied","Data":"da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d"} Feb 23 07:42:19 crc kubenswrapper[5118]: I0223 07:42:19.073406 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerStarted","Data":"83c34095cb3b993bdf8be0530584db49846d969c28c41ed38574bc5420d749cc"} Feb 23 07:42:20 crc kubenswrapper[5118]: I0223 07:42:20.085054 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerStarted","Data":"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc"} Feb 23 07:42:21 crc kubenswrapper[5118]: I0223 07:42:21.098722 5118 generic.go:334] "Generic (PLEG): container finished" podID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerID="4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc" exitCode=0 Feb 23 07:42:21 crc kubenswrapper[5118]: I0223 07:42:21.098861 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" 
event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerDied","Data":"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc"} Feb 23 07:42:22 crc kubenswrapper[5118]: I0223 07:42:22.111426 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerStarted","Data":"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5"} Feb 23 07:42:22 crc kubenswrapper[5118]: I0223 07:42:22.140542 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k72xf" podStartSLOduration=2.6590370009999997 podStartE2EDuration="5.140524326s" podCreationTimestamp="2026-02-23 07:42:17 +0000 UTC" firstStartedPulling="2026-02-23 07:42:19.075055991 +0000 UTC m=+3402.078840614" lastFinishedPulling="2026-02-23 07:42:21.556543336 +0000 UTC m=+3404.560327939" observedRunningTime="2026-02-23 07:42:22.136592081 +0000 UTC m=+3405.140376654" watchObservedRunningTime="2026-02-23 07:42:22.140524326 +0000 UTC m=+3405.144308899" Feb 23 07:42:22 crc kubenswrapper[5118]: I0223 07:42:22.697887 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:42:22 crc kubenswrapper[5118]: E0223 07:42:22.698422 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:42:27 crc kubenswrapper[5118]: I0223 07:42:27.957913 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:27 crc 
kubenswrapper[5118]: I0223 07:42:27.958719 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:28 crc kubenswrapper[5118]: I0223 07:42:28.034055 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:28 crc kubenswrapper[5118]: I0223 07:42:28.248077 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:28 crc kubenswrapper[5118]: I0223 07:42:28.326937 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.184012 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k72xf" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="registry-server" containerID="cri-o://8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5" gracePeriod=2 Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.658814 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.835376 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jml27\" (UniqueName: \"kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27\") pod \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.835537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content\") pod \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.835576 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities\") pod \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\" (UID: \"feaca131-5ecb-4a34-9b06-9bc9744b2ea8\") " Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.837410 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities" (OuterVolumeSpecName: "utilities") pod "feaca131-5ecb-4a34-9b06-9bc9744b2ea8" (UID: "feaca131-5ecb-4a34-9b06-9bc9744b2ea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.842885 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27" (OuterVolumeSpecName: "kube-api-access-jml27") pod "feaca131-5ecb-4a34-9b06-9bc9744b2ea8" (UID: "feaca131-5ecb-4a34-9b06-9bc9744b2ea8"). InnerVolumeSpecName "kube-api-access-jml27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.899992 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feaca131-5ecb-4a34-9b06-9bc9744b2ea8" (UID: "feaca131-5ecb-4a34-9b06-9bc9744b2ea8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.937564 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jml27\" (UniqueName: \"kubernetes.io/projected/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-kube-api-access-jml27\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.937610 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:30 crc kubenswrapper[5118]: I0223 07:42:30.937624 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feaca131-5ecb-4a34-9b06-9bc9744b2ea8-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.200079 5118 generic.go:334] "Generic (PLEG): container finished" podID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerID="8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5" exitCode=0 Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.200185 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerDied","Data":"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5"} Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.200229 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-k72xf" event={"ID":"feaca131-5ecb-4a34-9b06-9bc9744b2ea8","Type":"ContainerDied","Data":"83c34095cb3b993bdf8be0530584db49846d969c28c41ed38574bc5420d749cc"} Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.200251 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k72xf" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.200277 5118 scope.go:117] "RemoveContainer" containerID="8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.231780 5118 scope.go:117] "RemoveContainer" containerID="4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.259480 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.272226 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k72xf"] Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.274600 5118 scope.go:117] "RemoveContainer" containerID="da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.301184 5118 scope.go:117] "RemoveContainer" containerID="8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5" Feb 23 07:42:31 crc kubenswrapper[5118]: E0223 07:42:31.301751 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5\": container with ID starting with 8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5 not found: ID does not exist" containerID="8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 
07:42:31.301797 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5"} err="failed to get container status \"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5\": rpc error: code = NotFound desc = could not find container \"8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5\": container with ID starting with 8d515dbcf9490be8aa5687e9f3422c6900cb9b24b7f5c322bade3690b14033a5 not found: ID does not exist" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.301832 5118 scope.go:117] "RemoveContainer" containerID="4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc" Feb 23 07:42:31 crc kubenswrapper[5118]: E0223 07:42:31.302271 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc\": container with ID starting with 4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc not found: ID does not exist" containerID="4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.302324 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc"} err="failed to get container status \"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc\": rpc error: code = NotFound desc = could not find container \"4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc\": container with ID starting with 4a7e72772d2f278daba8046346fcf3e0808c9bfde324f87ac5b7685cb8b126cc not found: ID does not exist" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.302359 5118 scope.go:117] "RemoveContainer" containerID="da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d" Feb 23 07:42:31 crc 
kubenswrapper[5118]: E0223 07:42:31.302648 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d\": container with ID starting with da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d not found: ID does not exist" containerID="da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.302694 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d"} err="failed to get container status \"da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d\": rpc error: code = NotFound desc = could not find container \"da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d\": container with ID starting with da15ad2a67dd4ec3a1b2923dec8f7ef730739afc1e3e8aabca0cf6d849ae467d not found: ID does not exist" Feb 23 07:42:31 crc kubenswrapper[5118]: I0223 07:42:31.716503 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" path="/var/lib/kubelet/pods/feaca131-5ecb-4a34-9b06-9bc9744b2ea8/volumes" Feb 23 07:42:33 crc kubenswrapper[5118]: I0223 07:42:33.697358 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:42:33 crc kubenswrapper[5118]: E0223 07:42:33.698237 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:42:46 crc 
kubenswrapper[5118]: I0223 07:42:46.697504 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:42:46 crc kubenswrapper[5118]: E0223 07:42:46.698570 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.491402 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:42:48 crc kubenswrapper[5118]: E0223 07:42:48.492636 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="extract-content" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.492661 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="extract-content" Feb 23 07:42:48 crc kubenswrapper[5118]: E0223 07:42:48.492712 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="registry-server" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.492725 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="registry-server" Feb 23 07:42:48 crc kubenswrapper[5118]: E0223 07:42:48.492744 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="extract-utilities" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.492758 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" 
containerName="extract-utilities" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.493068 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="feaca131-5ecb-4a34-9b06-9bc9744b2ea8" containerName="registry-server" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.495330 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.505027 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.655289 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.655811 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.655848 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2mv\" (UniqueName: \"kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.758249 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.758352 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.758394 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2mv\" (UniqueName: \"kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.758969 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.759349 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.795418 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2mv\" (UniqueName: 
\"kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv\") pod \"redhat-marketplace-rs58j\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:48 crc kubenswrapper[5118]: I0223 07:42:48.856243 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:49 crc kubenswrapper[5118]: I0223 07:42:49.189796 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:42:49 crc kubenswrapper[5118]: I0223 07:42:49.377774 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs58j" event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerStarted","Data":"3716ac0309594d01cd882be0f39fc7abe8141a6a33faeea95203780ff60008d0"} Feb 23 07:42:50 crc kubenswrapper[5118]: I0223 07:42:50.392831 5118 generic.go:334] "Generic (PLEG): container finished" podID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerID="5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12" exitCode=0 Feb 23 07:42:50 crc kubenswrapper[5118]: I0223 07:42:50.393009 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs58j" event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerDied","Data":"5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12"} Feb 23 07:42:51 crc kubenswrapper[5118]: I0223 07:42:51.416756 5118 generic.go:334] "Generic (PLEG): container finished" podID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerID="1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603" exitCode=0 Feb 23 07:42:51 crc kubenswrapper[5118]: I0223 07:42:51.416983 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs58j" 
event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerDied","Data":"1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603"} Feb 23 07:42:52 crc kubenswrapper[5118]: I0223 07:42:52.429830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs58j" event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerStarted","Data":"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8"} Feb 23 07:42:52 crc kubenswrapper[5118]: I0223 07:42:52.470864 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rs58j" podStartSLOduration=3.052015831 podStartE2EDuration="4.470829069s" podCreationTimestamp="2026-02-23 07:42:48 +0000 UTC" firstStartedPulling="2026-02-23 07:42:50.397264849 +0000 UTC m=+3433.401049462" lastFinishedPulling="2026-02-23 07:42:51.816078087 +0000 UTC m=+3434.819862700" observedRunningTime="2026-02-23 07:42:52.46464434 +0000 UTC m=+3435.468428923" watchObservedRunningTime="2026-02-23 07:42:52.470829069 +0000 UTC m=+3435.474613692" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.216971 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.220032 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.244515 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.329640 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.329713 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.330000 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpncd\" (UniqueName: \"kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.431137 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpncd\" (UniqueName: \"kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.431235 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.431272 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.431984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.432147 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.458861 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpncd\" (UniqueName: \"kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd\") pod \"community-operators-6k7p9\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:53 crc kubenswrapper[5118]: I0223 07:42:53.558295 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:42:54 crc kubenswrapper[5118]: I0223 07:42:54.046705 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:42:54 crc kubenswrapper[5118]: I0223 07:42:54.448005 5118 generic.go:334] "Generic (PLEG): container finished" podID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerID="39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe" exitCode=0 Feb 23 07:42:54 crc kubenswrapper[5118]: I0223 07:42:54.448051 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerDied","Data":"39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe"} Feb 23 07:42:54 crc kubenswrapper[5118]: I0223 07:42:54.448483 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerStarted","Data":"6f60d70bf82da2f396321d7e97dbc0394c235fcba0fa45bad27eec4b00292bc7"} Feb 23 07:42:55 crc kubenswrapper[5118]: I0223 07:42:55.464393 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerStarted","Data":"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744"} Feb 23 07:42:56 crc kubenswrapper[5118]: I0223 07:42:56.478306 5118 generic.go:334] "Generic (PLEG): container finished" podID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerID="22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744" exitCode=0 Feb 23 07:42:56 crc kubenswrapper[5118]: I0223 07:42:56.478463 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" 
event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerDied","Data":"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744"} Feb 23 07:42:57 crc kubenswrapper[5118]: I0223 07:42:57.505848 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerStarted","Data":"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324"} Feb 23 07:42:57 crc kubenswrapper[5118]: I0223 07:42:57.546807 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6k7p9" podStartSLOduration=2.050329594 podStartE2EDuration="4.546764429s" podCreationTimestamp="2026-02-23 07:42:53 +0000 UTC" firstStartedPulling="2026-02-23 07:42:54.452341304 +0000 UTC m=+3437.456125917" lastFinishedPulling="2026-02-23 07:42:56.948776179 +0000 UTC m=+3439.952560752" observedRunningTime="2026-02-23 07:42:57.541988344 +0000 UTC m=+3440.545772917" watchObservedRunningTime="2026-02-23 07:42:57.546764429 +0000 UTC m=+3440.550549042" Feb 23 07:42:58 crc kubenswrapper[5118]: I0223 07:42:58.856565 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:58 crc kubenswrapper[5118]: I0223 07:42:58.857067 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:58 crc kubenswrapper[5118]: I0223 07:42:58.941018 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:42:59 crc kubenswrapper[5118]: I0223 07:42:59.601700 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:43:00 crc kubenswrapper[5118]: I0223 07:43:00.206332 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:43:01 crc kubenswrapper[5118]: I0223 07:43:01.543907 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rs58j" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="registry-server" containerID="cri-o://d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8" gracePeriod=2 Feb 23 07:43:01 crc kubenswrapper[5118]: I0223 07:43:01.698340 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:43:01 crc kubenswrapper[5118]: E0223 07:43:01.698709 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.078402 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.184538 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x2mv\" (UniqueName: \"kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv\") pod \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.184621 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content\") pod \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.184683 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities\") pod \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\" (UID: \"d5a2c0bb-6ef8-42ea-8af9-11d016584a22\") " Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.186322 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities" (OuterVolumeSpecName: "utilities") pod "d5a2c0bb-6ef8-42ea-8af9-11d016584a22" (UID: "d5a2c0bb-6ef8-42ea-8af9-11d016584a22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.192364 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv" (OuterVolumeSpecName: "kube-api-access-7x2mv") pod "d5a2c0bb-6ef8-42ea-8af9-11d016584a22" (UID: "d5a2c0bb-6ef8-42ea-8af9-11d016584a22"). InnerVolumeSpecName "kube-api-access-7x2mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.232177 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5a2c0bb-6ef8-42ea-8af9-11d016584a22" (UID: "d5a2c0bb-6ef8-42ea-8af9-11d016584a22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.288008 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x2mv\" (UniqueName: \"kubernetes.io/projected/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-kube-api-access-7x2mv\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.288080 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.288137 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a2c0bb-6ef8-42ea-8af9-11d016584a22-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.554624 5118 generic.go:334] "Generic (PLEG): container finished" podID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerID="d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8" exitCode=0 Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.554702 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rs58j" event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerDied","Data":"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8"} Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.554756 5118 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-rs58j" event={"ID":"d5a2c0bb-6ef8-42ea-8af9-11d016584a22","Type":"ContainerDied","Data":"3716ac0309594d01cd882be0f39fc7abe8141a6a33faeea95203780ff60008d0"} Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.554774 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rs58j" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.554786 5118 scope.go:117] "RemoveContainer" containerID="d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.577978 5118 scope.go:117] "RemoveContainer" containerID="1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.608456 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.610883 5118 scope.go:117] "RemoveContainer" containerID="5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.616108 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rs58j"] Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.644331 5118 scope.go:117] "RemoveContainer" containerID="d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8" Feb 23 07:43:02 crc kubenswrapper[5118]: E0223 07:43:02.644922 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8\": container with ID starting with d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8 not found: ID does not exist" containerID="d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.644973 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8"} err="failed to get container status \"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8\": rpc error: code = NotFound desc = could not find container \"d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8\": container with ID starting with d18fbae02b86eb53345be58b38dfde3f5496f106632d26338b96a7372cf511f8 not found: ID does not exist" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.645007 5118 scope.go:117] "RemoveContainer" containerID="1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603" Feb 23 07:43:02 crc kubenswrapper[5118]: E0223 07:43:02.645445 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603\": container with ID starting with 1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603 not found: ID does not exist" containerID="1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.645503 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603"} err="failed to get container status \"1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603\": rpc error: code = NotFound desc = could not find container \"1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603\": container with ID starting with 1b0e5b798b10e55cdc4d69d9776f07830c29e52c3ec35a5c14997812f6e6b603 not found: ID does not exist" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.645538 5118 scope.go:117] "RemoveContainer" containerID="5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12" Feb 23 07:43:02 crc kubenswrapper[5118]: E0223 
07:43:02.645979 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12\": container with ID starting with 5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12 not found: ID does not exist" containerID="5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12" Feb 23 07:43:02 crc kubenswrapper[5118]: I0223 07:43:02.646015 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12"} err="failed to get container status \"5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12\": rpc error: code = NotFound desc = could not find container \"5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12\": container with ID starting with 5728a9c8fc3358f5454b0c09d2215a73a649bf1ac4732a0915c73c8eac20de12 not found: ID does not exist" Feb 23 07:43:03 crc kubenswrapper[5118]: I0223 07:43:03.559226 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:03 crc kubenswrapper[5118]: I0223 07:43:03.560594 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:03 crc kubenswrapper[5118]: I0223 07:43:03.639290 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:03 crc kubenswrapper[5118]: I0223 07:43:03.714591 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" path="/var/lib/kubelet/pods/d5a2c0bb-6ef8-42ea-8af9-11d016584a22/volumes" Feb 23 07:43:04 crc kubenswrapper[5118]: I0223 07:43:04.656564 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:05 crc kubenswrapper[5118]: I0223 07:43:05.605963 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:43:06 crc kubenswrapper[5118]: I0223 07:43:06.599547 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6k7p9" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="registry-server" containerID="cri-o://74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324" gracePeriod=2 Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.078781 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.200537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities\") pod \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.200774 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpncd\" (UniqueName: \"kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd\") pod \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.200852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content\") pod \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\" (UID: \"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668\") " Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.201880 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities" (OuterVolumeSpecName: "utilities") pod "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" (UID: "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.212540 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd" (OuterVolumeSpecName: "kube-api-access-rpncd") pod "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" (UID: "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668"). InnerVolumeSpecName "kube-api-access-rpncd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.296640 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" (UID: "6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.302902 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.302959 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpncd\" (UniqueName: \"kubernetes.io/projected/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-kube-api-access-rpncd\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.302981 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.623267 5118 generic.go:334] "Generic (PLEG): container finished" podID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerID="74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324" exitCode=0 Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.623433 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6k7p9" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.624256 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerDied","Data":"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324"} Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.624502 5118 scope.go:117] "RemoveContainer" containerID="74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.624656 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6k7p9" event={"ID":"6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668","Type":"ContainerDied","Data":"6f60d70bf82da2f396321d7e97dbc0394c235fcba0fa45bad27eec4b00292bc7"} Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.660373 5118 scope.go:117] "RemoveContainer" containerID="22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.683711 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.694603 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6k7p9"] Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.701746 5118 scope.go:117] "RemoveContainer" containerID="39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.710398 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" path="/var/lib/kubelet/pods/6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668/volumes" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.751151 5118 scope.go:117] "RemoveContainer" 
containerID="74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324" Feb 23 07:43:07 crc kubenswrapper[5118]: E0223 07:43:07.753263 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324\": container with ID starting with 74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324 not found: ID does not exist" containerID="74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.753317 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324"} err="failed to get container status \"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324\": rpc error: code = NotFound desc = could not find container \"74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324\": container with ID starting with 74b072f755365abd123ca9a78b44b1c449260f6d4f02c18c7eaf588c2bd6b324 not found: ID does not exist" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.753348 5118 scope.go:117] "RemoveContainer" containerID="22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744" Feb 23 07:43:07 crc kubenswrapper[5118]: E0223 07:43:07.753875 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744\": container with ID starting with 22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744 not found: ID does not exist" containerID="22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.753898 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744"} err="failed to get container status \"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744\": rpc error: code = NotFound desc = could not find container \"22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744\": container with ID starting with 22a3d3934f0ca498abacdda2d02b8357d46502bf4f9c525b6c0a4d665318e744 not found: ID does not exist" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.753915 5118 scope.go:117] "RemoveContainer" containerID="39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe" Feb 23 07:43:07 crc kubenswrapper[5118]: E0223 07:43:07.756505 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe\": container with ID starting with 39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe not found: ID does not exist" containerID="39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe" Feb 23 07:43:07 crc kubenswrapper[5118]: I0223 07:43:07.756576 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe"} err="failed to get container status \"39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe\": rpc error: code = NotFound desc = could not find container \"39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe\": container with ID starting with 39bb88fa599870918a1d60e4cb7c84b90491a4d7c4b05555d8d874e93603bcfe not found: ID does not exist" Feb 23 07:43:15 crc kubenswrapper[5118]: I0223 07:43:15.698534 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:43:15 crc kubenswrapper[5118]: E0223 07:43:15.699227 5118 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:43:30 crc kubenswrapper[5118]: I0223 07:43:30.697455 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:43:30 crc kubenswrapper[5118]: E0223 07:43:30.698778 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:43:45 crc kubenswrapper[5118]: I0223 07:43:45.697728 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:43:46 crc kubenswrapper[5118]: I0223 07:43:46.006455 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976"} Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.168878 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792"] Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170317 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: 
I0223 07:45:00.170340 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170359 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170365 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170401 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="extract-utilities" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170410 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="extract-utilities" Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170427 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="extract-utilities" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170433 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="extract-utilities" Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170444 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="extract-content" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170450 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="extract-content" Feb 23 07:45:00 crc kubenswrapper[5118]: E0223 07:45:00.170459 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="extract-content" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 
07:45:00.170465 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="extract-content" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170640 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a2c0bb-6ef8-42ea-8af9-11d016584a22" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.170666 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2e4c3f-22e2-4dc6-9c6f-3db3fd5c2668" containerName="registry-server" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.171280 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.174357 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.174369 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.185257 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792"] Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.305417 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59g9d\" (UniqueName: \"kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.305541 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.305571 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.406970 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59g9d\" (UniqueName: \"kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.407410 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.407547 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc 
kubenswrapper[5118]: I0223 07:45:00.408923 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.423583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.441465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59g9d\" (UniqueName: \"kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d\") pod \"collect-profiles-29530545-b4792\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.516824 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:00 crc kubenswrapper[5118]: I0223 07:45:00.797240 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792"] Feb 23 07:45:01 crc kubenswrapper[5118]: I0223 07:45:01.786378 5118 generic.go:334] "Generic (PLEG): container finished" podID="033ed359-95df-44ef-bbfc-c9eee59cf768" containerID="c924d9579c69ecb1af079b8248649efbb7e99c1e614b745c9fc0d5afc60daa31" exitCode=0 Feb 23 07:45:01 crc kubenswrapper[5118]: I0223 07:45:01.786441 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" event={"ID":"033ed359-95df-44ef-bbfc-c9eee59cf768","Type":"ContainerDied","Data":"c924d9579c69ecb1af079b8248649efbb7e99c1e614b745c9fc0d5afc60daa31"} Feb 23 07:45:01 crc kubenswrapper[5118]: I0223 07:45:01.786848 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" event={"ID":"033ed359-95df-44ef-bbfc-c9eee59cf768","Type":"ContainerStarted","Data":"e308ca5fda16d8d966c064f448c267b9a92fb914fa17da6afd6091232b026685"} Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.159146 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.260969 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume\") pod \"033ed359-95df-44ef-bbfc-c9eee59cf768\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.261050 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59g9d\" (UniqueName: \"kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d\") pod \"033ed359-95df-44ef-bbfc-c9eee59cf768\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.261196 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume\") pod \"033ed359-95df-44ef-bbfc-c9eee59cf768\" (UID: \"033ed359-95df-44ef-bbfc-c9eee59cf768\") " Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.262005 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume" (OuterVolumeSpecName: "config-volume") pod "033ed359-95df-44ef-bbfc-c9eee59cf768" (UID: "033ed359-95df-44ef-bbfc-c9eee59cf768"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.269437 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "033ed359-95df-44ef-bbfc-c9eee59cf768" (UID: "033ed359-95df-44ef-bbfc-c9eee59cf768"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.281347 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d" (OuterVolumeSpecName: "kube-api-access-59g9d") pod "033ed359-95df-44ef-bbfc-c9eee59cf768" (UID: "033ed359-95df-44ef-bbfc-c9eee59cf768"). InnerVolumeSpecName "kube-api-access-59g9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.363807 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/033ed359-95df-44ef-bbfc-c9eee59cf768-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.363905 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/033ed359-95df-44ef-bbfc-c9eee59cf768-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.363931 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59g9d\" (UniqueName: \"kubernetes.io/projected/033ed359-95df-44ef-bbfc-c9eee59cf768-kube-api-access-59g9d\") on node \"crc\" DevicePath \"\"" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.808271 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" event={"ID":"033ed359-95df-44ef-bbfc-c9eee59cf768","Type":"ContainerDied","Data":"e308ca5fda16d8d966c064f448c267b9a92fb914fa17da6afd6091232b026685"} Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.808338 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e308ca5fda16d8d966c064f448c267b9a92fb914fa17da6afd6091232b026685" Feb 23 07:45:03 crc kubenswrapper[5118]: I0223 07:45:03.808345 5118 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792" Feb 23 07:45:04 crc kubenswrapper[5118]: I0223 07:45:04.248530 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl"] Feb 23 07:45:04 crc kubenswrapper[5118]: I0223 07:45:04.255025 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530500-sqsbl"] Feb 23 07:45:05 crc kubenswrapper[5118]: I0223 07:45:05.714145 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64221749-8f82-47ac-8b0f-cfef43a17733" path="/var/lib/kubelet/pods/64221749-8f82-47ac-8b0f-cfef43a17733/volumes" Feb 23 07:45:49 crc kubenswrapper[5118]: I0223 07:45:49.845412 5118 scope.go:117] "RemoveContainer" containerID="daa505d71a81ac09727ecefc26fb5bbbe89811b2651cf6ab2719021aa0fa6d91" Feb 23 07:46:02 crc kubenswrapper[5118]: I0223 07:46:02.975242 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:46:02 crc kubenswrapper[5118]: I0223 07:46:02.975906 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:46:32 crc kubenswrapper[5118]: I0223 07:46:32.975239 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 23 07:46:32 crc kubenswrapper[5118]: I0223 07:46:32.977953 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:47:02 crc kubenswrapper[5118]: I0223 07:47:02.975947 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:47:02 crc kubenswrapper[5118]: I0223 07:47:02.976766 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:47:02 crc kubenswrapper[5118]: I0223 07:47:02.976838 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:47:02 crc kubenswrapper[5118]: I0223 07:47:02.977953 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:47:02 crc kubenswrapper[5118]: I0223 07:47:02.978047 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976" gracePeriod=600 Feb 23 07:47:03 crc kubenswrapper[5118]: I0223 07:47:03.961929 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976" exitCode=0 Feb 23 07:47:03 crc kubenswrapper[5118]: I0223 07:47:03.962030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976"} Feb 23 07:47:03 crc kubenswrapper[5118]: I0223 07:47:03.963255 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"} Feb 23 07:47:03 crc kubenswrapper[5118]: I0223 07:47:03.963298 5118 scope.go:117] "RemoveContainer" containerID="f752ddfc183b037f3466fb3afc66f824523b95427e54ba875312240c3dc04e35" Feb 23 07:49:32 crc kubenswrapper[5118]: I0223 07:49:32.975478 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:49:32 crc kubenswrapper[5118]: I0223 07:49:32.976301 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.439788 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:49:45 crc kubenswrapper[5118]: E0223 07:49:45.441465 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033ed359-95df-44ef-bbfc-c9eee59cf768" containerName="collect-profiles" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.441496 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="033ed359-95df-44ef-bbfc-c9eee59cf768" containerName="collect-profiles" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.441852 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="033ed359-95df-44ef-bbfc-c9eee59cf768" containerName="collect-profiles" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.444240 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.467038 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.557288 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.557617 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.557716 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fh6\" (UniqueName: \"kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.660993 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.661149 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-27fh6\" (UniqueName: \"kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.661552 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.662482 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.662821 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.701544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fh6\" (UniqueName: \"kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6\") pod \"redhat-operators-f2mlg\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:45 crc kubenswrapper[5118]: I0223 07:49:45.814417 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:46 crc kubenswrapper[5118]: I0223 07:49:46.278229 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:49:46 crc kubenswrapper[5118]: I0223 07:49:46.603016 5118 generic.go:334] "Generic (PLEG): container finished" podID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerID="1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57" exitCode=0 Feb 23 07:49:46 crc kubenswrapper[5118]: I0223 07:49:46.603084 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerDied","Data":"1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57"} Feb 23 07:49:46 crc kubenswrapper[5118]: I0223 07:49:46.603160 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerStarted","Data":"e22b48c16722910a477f037b6ce064aa88a5f60b5d706f99a4a6b0b8f740da6e"} Feb 23 07:49:46 crc kubenswrapper[5118]: I0223 07:49:46.605497 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 07:49:47 crc kubenswrapper[5118]: I0223 07:49:47.610596 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerStarted","Data":"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0"} Feb 23 07:49:48 crc kubenswrapper[5118]: E0223 07:49:48.237526 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d6c928_3ddc_47e7_ac18_8961c5bb1055.slice/crio-a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d6c928_3ddc_47e7_ac18_8961c5bb1055.slice/crio-conmon-a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0.scope\": RecentStats: unable to find data in memory cache]" Feb 23 07:49:48 crc kubenswrapper[5118]: I0223 07:49:48.623192 5118 generic.go:334] "Generic (PLEG): container finished" podID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerID="a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0" exitCode=0 Feb 23 07:49:48 crc kubenswrapper[5118]: I0223 07:49:48.623298 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerDied","Data":"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0"} Feb 23 07:49:49 crc kubenswrapper[5118]: I0223 07:49:49.635679 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerStarted","Data":"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041"} Feb 23 07:49:49 crc kubenswrapper[5118]: I0223 07:49:49.669500 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f2mlg" podStartSLOduration=2.231829195 podStartE2EDuration="4.669479229s" podCreationTimestamp="2026-02-23 07:49:45 +0000 UTC" firstStartedPulling="2026-02-23 07:49:46.605198323 +0000 UTC m=+3849.608982896" lastFinishedPulling="2026-02-23 07:49:49.042848347 +0000 UTC m=+3852.046632930" observedRunningTime="2026-02-23 07:49:49.65753416 +0000 UTC m=+3852.661318783" watchObservedRunningTime="2026-02-23 07:49:49.669479229 +0000 UTC m=+3852.673263812" Feb 23 07:49:55 crc kubenswrapper[5118]: I0223 07:49:55.815463 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:55 crc kubenswrapper[5118]: I0223 07:49:55.815956 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:49:56 crc kubenswrapper[5118]: I0223 07:49:56.887595 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f2mlg" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="registry-server" probeResult="failure" output=< Feb 23 07:49:56 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 07:49:56 crc kubenswrapper[5118]: > Feb 23 07:50:02 crc kubenswrapper[5118]: I0223 07:50:02.975301 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:50:02 crc kubenswrapper[5118]: I0223 07:50:02.975686 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 07:50:05 crc kubenswrapper[5118]: I0223 07:50:05.878580 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:50:05 crc kubenswrapper[5118]: I0223 07:50:05.963406 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:50:06 crc kubenswrapper[5118]: I0223 07:50:06.136272 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:50:07 crc kubenswrapper[5118]: I0223 
07:50:07.810055 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f2mlg" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="registry-server" containerID="cri-o://ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041" gracePeriod=2 Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.454245 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.545579 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27fh6\" (UniqueName: \"kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6\") pod \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.545640 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities\") pod \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.545720 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content\") pod \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\" (UID: \"c7d6c928-3ddc-47e7-ac18-8961c5bb1055\") " Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.546641 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities" (OuterVolumeSpecName: "utilities") pod "c7d6c928-3ddc-47e7-ac18-8961c5bb1055" (UID: "c7d6c928-3ddc-47e7-ac18-8961c5bb1055"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.569736 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6" (OuterVolumeSpecName: "kube-api-access-27fh6") pod "c7d6c928-3ddc-47e7-ac18-8961c5bb1055" (UID: "c7d6c928-3ddc-47e7-ac18-8961c5bb1055"). InnerVolumeSpecName "kube-api-access-27fh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.647172 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27fh6\" (UniqueName: \"kubernetes.io/projected/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-kube-api-access-27fh6\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.647217 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.716974 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7d6c928-3ddc-47e7-ac18-8961c5bb1055" (UID: "c7d6c928-3ddc-47e7-ac18-8961c5bb1055"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.749133 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d6c928-3ddc-47e7-ac18-8961c5bb1055-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.823005 5118 generic.go:334] "Generic (PLEG): container finished" podID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerID="ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041" exitCode=0 Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.823119 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerDied","Data":"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041"} Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.823166 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f2mlg" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.824347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f2mlg" event={"ID":"c7d6c928-3ddc-47e7-ac18-8961c5bb1055","Type":"ContainerDied","Data":"e22b48c16722910a477f037b6ce064aa88a5f60b5d706f99a4a6b0b8f740da6e"} Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.824392 5118 scope.go:117] "RemoveContainer" containerID="ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.871618 5118 scope.go:117] "RemoveContainer" containerID="a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.886530 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.897437 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f2mlg"] Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.904067 5118 scope.go:117] "RemoveContainer" containerID="1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.955308 5118 scope.go:117] "RemoveContainer" containerID="ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041" Feb 23 07:50:08 crc kubenswrapper[5118]: E0223 07:50:08.956444 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041\": container with ID starting with ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041 not found: ID does not exist" containerID="ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.956559 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041"} err="failed to get container status \"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041\": rpc error: code = NotFound desc = could not find container \"ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041\": container with ID starting with ae58e64d573ee451dbf256a1ff613b7ba83c48c78e662047be97f0fab5b21041 not found: ID does not exist" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.956620 5118 scope.go:117] "RemoveContainer" containerID="a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0" Feb 23 07:50:08 crc kubenswrapper[5118]: E0223 07:50:08.957389 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0\": container with ID starting with a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0 not found: ID does not exist" containerID="a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.957502 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0"} err="failed to get container status \"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0\": rpc error: code = NotFound desc = could not find container \"a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0\": container with ID starting with a30884d9cd45ae918858bf7ecf8d5ad8b41f75506924edaadaf6624551c738a0 not found: ID does not exist" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.957558 5118 scope.go:117] "RemoveContainer" containerID="1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57" Feb 23 07:50:08 crc kubenswrapper[5118]: E0223 
07:50:08.958243 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57\": container with ID starting with 1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57 not found: ID does not exist" containerID="1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57" Feb 23 07:50:08 crc kubenswrapper[5118]: I0223 07:50:08.958314 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57"} err="failed to get container status \"1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57\": rpc error: code = NotFound desc = could not find container \"1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57\": container with ID starting with 1484d6e868aab602adedbe06735fa4372e7fefbbbb6835e95bc529e7ec607e57 not found: ID does not exist" Feb 23 07:50:09 crc kubenswrapper[5118]: I0223 07:50:09.714798 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" path="/var/lib/kubelet/pods/c7d6c928-3ddc-47e7-ac18-8961c5bb1055/volumes" Feb 23 07:50:32 crc kubenswrapper[5118]: I0223 07:50:32.975971 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 07:50:32 crc kubenswrapper[5118]: I0223 07:50:32.976790 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 07:50:32 crc kubenswrapper[5118]: I0223 07:50:32.976886 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 07:50:32 crc kubenswrapper[5118]: I0223 07:50:32.978006 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 07:50:32 crc kubenswrapper[5118]: I0223 07:50:32.978139 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" gracePeriod=600 Feb 23 07:50:33 crc kubenswrapper[5118]: E0223 07:50:33.135198 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:50:34 crc kubenswrapper[5118]: I0223 07:50:34.077500 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" exitCode=0 Feb 23 07:50:34 crc kubenswrapper[5118]: I0223 07:50:34.077573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"} Feb 23 07:50:34 crc kubenswrapper[5118]: I0223 07:50:34.077631 5118 scope.go:117] "RemoveContainer" containerID="6404f02f8c8443c65debcde2916577f06d3f796a023d4c2a534d1e413492a976" Feb 23 07:50:34 crc kubenswrapper[5118]: I0223 07:50:34.079221 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:50:34 crc kubenswrapper[5118]: E0223 07:50:34.079671 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:50:45 crc kubenswrapper[5118]: I0223 07:50:45.698011 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:50:45 crc kubenswrapper[5118]: E0223 07:50:45.699037 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:50:58 crc kubenswrapper[5118]: I0223 07:50:58.698302 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:50:58 crc kubenswrapper[5118]: E0223 07:50:58.698967 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:51:12 crc kubenswrapper[5118]: I0223 07:51:12.697573 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:51:12 crc kubenswrapper[5118]: E0223 07:51:12.699472 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:51:24 crc kubenswrapper[5118]: I0223 07:51:24.698635 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:51:24 crc kubenswrapper[5118]: E0223 07:51:24.700218 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:51:35 crc kubenswrapper[5118]: I0223 07:51:35.697642 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:51:35 crc kubenswrapper[5118]: E0223 07:51:35.698616 5118 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:51:47 crc kubenswrapper[5118]: I0223 07:51:47.703414 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:51:47 crc kubenswrapper[5118]: E0223 07:51:47.704075 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:52:01 crc kubenswrapper[5118]: I0223 07:52:01.697870 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:52:01 crc kubenswrapper[5118]: E0223 07:52:01.698912 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:52:15 crc kubenswrapper[5118]: I0223 07:52:15.697738 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:52:15 crc kubenswrapper[5118]: E0223 07:52:15.699264 5118 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:52:29 crc kubenswrapper[5118]: I0223 07:52:29.697792 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:52:29 crc kubenswrapper[5118]: E0223 07:52:29.698793 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.949365 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:32 crc kubenswrapper[5118]: E0223 07:52:32.950278 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="registry-server" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.950296 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="registry-server" Feb 23 07:52:32 crc kubenswrapper[5118]: E0223 07:52:32.950315 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="extract-content" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.950323 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="extract-content" Feb 23 07:52:32 crc kubenswrapper[5118]: E0223 07:52:32.950357 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="extract-utilities" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.950366 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="extract-utilities" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.950557 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d6c928-3ddc-47e7-ac18-8961c5bb1055" containerName="registry-server" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.951993 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:32 crc kubenswrapper[5118]: I0223 07:52:32.985819 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.056975 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.057083 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8xvr\" (UniqueName: \"kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.057274 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.159782 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.159852 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8xvr\" (UniqueName: \"kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.159946 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.160638 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.161032 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.185891 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8xvr\" (UniqueName: \"kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr\") pod \"certified-operators-ftj4b\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.288707 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:33 crc kubenswrapper[5118]: I0223 07:52:33.763928 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:34 crc kubenswrapper[5118]: I0223 07:52:34.256681 5118 generic.go:334] "Generic (PLEG): container finished" podID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerID="70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7" exitCode=0 Feb 23 07:52:34 crc kubenswrapper[5118]: I0223 07:52:34.256774 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerDied","Data":"70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7"} Feb 23 07:52:34 crc kubenswrapper[5118]: I0223 07:52:34.256833 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerStarted","Data":"559b421054df71a2e2b3d7bd6a46721646d1c35e80237c62cbb1ac07a1a2facd"} Feb 23 07:52:35 crc kubenswrapper[5118]: I0223 
07:52:35.287172 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerStarted","Data":"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65"} Feb 23 07:52:36 crc kubenswrapper[5118]: I0223 07:52:36.299753 5118 generic.go:334] "Generic (PLEG): container finished" podID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerID="3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65" exitCode=0 Feb 23 07:52:36 crc kubenswrapper[5118]: I0223 07:52:36.299841 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerDied","Data":"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65"} Feb 23 07:52:38 crc kubenswrapper[5118]: I0223 07:52:38.321275 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerStarted","Data":"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782"} Feb 23 07:52:38 crc kubenswrapper[5118]: I0223 07:52:38.350290 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftj4b" podStartSLOduration=3.832145985 podStartE2EDuration="6.350260815s" podCreationTimestamp="2026-02-23 07:52:32 +0000 UTC" firstStartedPulling="2026-02-23 07:52:34.258467474 +0000 UTC m=+4017.262252077" lastFinishedPulling="2026-02-23 07:52:36.776582334 +0000 UTC m=+4019.780366907" observedRunningTime="2026-02-23 07:52:38.346879994 +0000 UTC m=+4021.350664597" watchObservedRunningTime="2026-02-23 07:52:38.350260815 +0000 UTC m=+4021.354045438" Feb 23 07:52:40 crc kubenswrapper[5118]: I0223 07:52:40.707087 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 
07:52:40 crc kubenswrapper[5118]: E0223 07:52:40.707670 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:52:43 crc kubenswrapper[5118]: I0223 07:52:43.290020 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:43 crc kubenswrapper[5118]: I0223 07:52:43.291476 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:43 crc kubenswrapper[5118]: I0223 07:52:43.371821 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:43 crc kubenswrapper[5118]: I0223 07:52:43.437623 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:45 crc kubenswrapper[5118]: I0223 07:52:45.918644 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:45 crc kubenswrapper[5118]: I0223 07:52:45.919581 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftj4b" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="registry-server" containerID="cri-o://94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782" gracePeriod=2 Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.330280 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.395058 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities\") pod \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.395210 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8xvr\" (UniqueName: \"kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr\") pod \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.395264 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content\") pod \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\" (UID: \"e8cff172-778b-4a9f-bb19-2cd6e64097fb\") " Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.396363 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities" (OuterVolumeSpecName: "utilities") pod "e8cff172-778b-4a9f-bb19-2cd6e64097fb" (UID: "e8cff172-778b-4a9f-bb19-2cd6e64097fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.408947 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr" (OuterVolumeSpecName: "kube-api-access-z8xvr") pod "e8cff172-778b-4a9f-bb19-2cd6e64097fb" (UID: "e8cff172-778b-4a9f-bb19-2cd6e64097fb"). InnerVolumeSpecName "kube-api-access-z8xvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.410205 5118 generic.go:334] "Generic (PLEG): container finished" podID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerID="94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782" exitCode=0 Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.410269 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerDied","Data":"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782"} Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.410308 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftj4b" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.410337 5118 scope.go:117] "RemoveContainer" containerID="94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.410318 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftj4b" event={"ID":"e8cff172-778b-4a9f-bb19-2cd6e64097fb","Type":"ContainerDied","Data":"559b421054df71a2e2b3d7bd6a46721646d1c35e80237c62cbb1ac07a1a2facd"} Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.448358 5118 scope.go:117] "RemoveContainer" containerID="3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.479387 5118 scope.go:117] "RemoveContainer" containerID="70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.485654 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8cff172-778b-4a9f-bb19-2cd6e64097fb" (UID: 
"e8cff172-778b-4a9f-bb19-2cd6e64097fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.497882 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.497938 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8xvr\" (UniqueName: \"kubernetes.io/projected/e8cff172-778b-4a9f-bb19-2cd6e64097fb-kube-api-access-z8xvr\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.497974 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8cff172-778b-4a9f-bb19-2cd6e64097fb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.503607 5118 scope.go:117] "RemoveContainer" containerID="94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782" Feb 23 07:52:46 crc kubenswrapper[5118]: E0223 07:52:46.509678 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782\": container with ID starting with 94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782 not found: ID does not exist" containerID="94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.509761 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782"} err="failed to get container status \"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782\": rpc error: code = NotFound desc = could not find container 
\"94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782\": container with ID starting with 94bb84bdf5213f12cb39ec8ad699f03095f4401d456cccd40dc586fe71f3b782 not found: ID does not exist" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.509810 5118 scope.go:117] "RemoveContainer" containerID="3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65" Feb 23 07:52:46 crc kubenswrapper[5118]: E0223 07:52:46.510409 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65\": container with ID starting with 3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65 not found: ID does not exist" containerID="3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.510477 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65"} err="failed to get container status \"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65\": rpc error: code = NotFound desc = could not find container \"3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65\": container with ID starting with 3468545dd5e844b9c88df07db35dd020a91820e171af37696f29113fe116aa65 not found: ID does not exist" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.510509 5118 scope.go:117] "RemoveContainer" containerID="70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7" Feb 23 07:52:46 crc kubenswrapper[5118]: E0223 07:52:46.511171 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7\": container with ID starting with 70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7 not found: ID does not exist" 
containerID="70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.511284 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7"} err="failed to get container status \"70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7\": rpc error: code = NotFound desc = could not find container \"70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7\": container with ID starting with 70e9f7f52559049b764c4266528d8bc92b9c0e68503a7205c5ce2d855cd116e7 not found: ID does not exist" Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.759718 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:46 crc kubenswrapper[5118]: I0223 07:52:46.766191 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftj4b"] Feb 23 07:52:47 crc kubenswrapper[5118]: I0223 07:52:47.718742 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" path="/var/lib/kubelet/pods/e8cff172-778b-4a9f-bb19-2cd6e64097fb/volumes" Feb 23 07:52:54 crc kubenswrapper[5118]: I0223 07:52:54.697271 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:52:54 crc kubenswrapper[5118]: E0223 07:52:54.698630 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:53:05 crc kubenswrapper[5118]: I0223 
07:53:05.698057 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:53:05 crc kubenswrapper[5118]: E0223 07:53:05.699589 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:53:16 crc kubenswrapper[5118]: I0223 07:53:16.697687 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:53:16 crc kubenswrapper[5118]: E0223 07:53:16.698464 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.065135 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:30 crc kubenswrapper[5118]: E0223 07:53:30.066714 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="extract-utilities" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.066776 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="extract-utilities" Feb 23 07:53:30 crc kubenswrapper[5118]: E0223 07:53:30.066809 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="registry-server" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.066825 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="registry-server" Feb 23 07:53:30 crc kubenswrapper[5118]: E0223 07:53:30.066853 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="extract-content" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.066870 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="extract-content" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.069711 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8cff172-778b-4a9f-bb19-2cd6e64097fb" containerName="registry-server" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.071887 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.085205 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.114129 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.114275 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9m6\" (UniqueName: \"kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6\") pod \"community-operators-8xzvd\" (UID: 
\"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.114347 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.216085 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.216655 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh9m6\" (UniqueName: \"kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.217416 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.216940 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content\") pod \"community-operators-8xzvd\" (UID: 
\"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.218214 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.242741 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh9m6\" (UniqueName: \"kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6\") pod \"community-operators-8xzvd\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.411637 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:30 crc kubenswrapper[5118]: I0223 07:53:30.930015 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:30 crc kubenswrapper[5118]: W0223 07:53:30.944422 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38443ff1_0890_4277_a2db_c85e96ab95fa.slice/crio-6a81ccab57b79fc50099fabdd1711645c5556e2931ea976ce8cd86a9a86f588f WatchSource:0}: Error finding container 6a81ccab57b79fc50099fabdd1711645c5556e2931ea976ce8cd86a9a86f588f: Status 404 returned error can't find the container with id 6a81ccab57b79fc50099fabdd1711645c5556e2931ea976ce8cd86a9a86f588f Feb 23 07:53:31 crc kubenswrapper[5118]: I0223 07:53:31.697878 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:53:31 crc 
kubenswrapper[5118]: E0223 07:53:31.698238 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:53:31 crc kubenswrapper[5118]: I0223 07:53:31.911616 5118 generic.go:334] "Generic (PLEG): container finished" podID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerID="f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad" exitCode=0 Feb 23 07:53:31 crc kubenswrapper[5118]: I0223 07:53:31.911711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerDied","Data":"f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad"} Feb 23 07:53:31 crc kubenswrapper[5118]: I0223 07:53:31.911770 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerStarted","Data":"6a81ccab57b79fc50099fabdd1711645c5556e2931ea976ce8cd86a9a86f588f"} Feb 23 07:53:32 crc kubenswrapper[5118]: I0223 07:53:32.923925 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerStarted","Data":"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d"} Feb 23 07:53:33 crc kubenswrapper[5118]: I0223 07:53:33.937259 5118 generic.go:334] "Generic (PLEG): container finished" podID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerID="5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d" exitCode=0 Feb 23 07:53:33 crc kubenswrapper[5118]: 
I0223 07:53:33.937338 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerDied","Data":"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d"} Feb 23 07:53:34 crc kubenswrapper[5118]: I0223 07:53:34.959179 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerStarted","Data":"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e"} Feb 23 07:53:34 crc kubenswrapper[5118]: I0223 07:53:34.987976 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8xzvd" podStartSLOduration=2.578784971 podStartE2EDuration="4.987953026s" podCreationTimestamp="2026-02-23 07:53:30 +0000 UTC" firstStartedPulling="2026-02-23 07:53:31.914855974 +0000 UTC m=+4074.918640567" lastFinishedPulling="2026-02-23 07:53:34.324024019 +0000 UTC m=+4077.327808622" observedRunningTime="2026-02-23 07:53:34.984253136 +0000 UTC m=+4077.988037739" watchObservedRunningTime="2026-02-23 07:53:34.987953026 +0000 UTC m=+4077.991737609" Feb 23 07:53:40 crc kubenswrapper[5118]: I0223 07:53:40.412391 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:40 crc kubenswrapper[5118]: I0223 07:53:40.413754 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:40 crc kubenswrapper[5118]: I0223 07:53:40.489045 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:41 crc kubenswrapper[5118]: I0223 07:53:41.089703 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8xzvd" 
Feb 23 07:53:41 crc kubenswrapper[5118]: I0223 07:53:41.185176 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.036291 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8xzvd" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="registry-server" containerID="cri-o://a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e" gracePeriod=2 Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.351757 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.360835 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.377581 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.459377 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.459461 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r64fd\" (UniqueName: \"kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.459493 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.560790 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r64fd\" (UniqueName: \"kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.560870 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.561453 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.562225 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.562740 5118 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.566115 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.593149 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r64fd\" (UniqueName: \"kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd\") pod \"redhat-marketplace-r4k7r\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.663402 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities\") pod \"38443ff1-0890-4277-a2db-c85e96ab95fa\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.663537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh9m6\" (UniqueName: \"kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6\") pod \"38443ff1-0890-4277-a2db-c85e96ab95fa\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.663591 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content\") pod \"38443ff1-0890-4277-a2db-c85e96ab95fa\" (UID: \"38443ff1-0890-4277-a2db-c85e96ab95fa\") " Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 
07:53:43.664497 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities" (OuterVolumeSpecName: "utilities") pod "38443ff1-0890-4277-a2db-c85e96ab95fa" (UID: "38443ff1-0890-4277-a2db-c85e96ab95fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.671756 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6" (OuterVolumeSpecName: "kube-api-access-xh9m6") pod "38443ff1-0890-4277-a2db-c85e96ab95fa" (UID: "38443ff1-0890-4277-a2db-c85e96ab95fa"). InnerVolumeSpecName "kube-api-access-xh9m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.712112 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38443ff1-0890-4277-a2db-c85e96ab95fa" (UID: "38443ff1-0890-4277-a2db-c85e96ab95fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.716434 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.765421 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.765462 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh9m6\" (UniqueName: \"kubernetes.io/projected/38443ff1-0890-4277-a2db-c85e96ab95fa-kube-api-access-xh9m6\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:43 crc kubenswrapper[5118]: I0223 07:53:43.765475 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38443ff1-0890-4277-a2db-c85e96ab95fa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.048385 5118 generic.go:334] "Generic (PLEG): container finished" podID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerID="a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e" exitCode=0 Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.048481 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerDied","Data":"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e"} Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.048690 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8xzvd" event={"ID":"38443ff1-0890-4277-a2db-c85e96ab95fa","Type":"ContainerDied","Data":"6a81ccab57b79fc50099fabdd1711645c5556e2931ea976ce8cd86a9a86f588f"} Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.048545 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8xzvd" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.048710 5118 scope.go:117] "RemoveContainer" containerID="a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.070556 5118 scope.go:117] "RemoveContainer" containerID="5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.088229 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.091644 5118 scope.go:117] "RemoveContainer" containerID="f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.093285 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8xzvd"] Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.112302 5118 scope.go:117] "RemoveContainer" containerID="a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e" Feb 23 07:53:44 crc kubenswrapper[5118]: E0223 07:53:44.112755 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e\": container with ID starting with a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e not found: ID does not exist" containerID="a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.112791 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e"} err="failed to get container status \"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e\": rpc error: code = NotFound desc = could not find 
container \"a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e\": container with ID starting with a418037fa69659db8f84ff0dba85966bc33e0c8b5bc452e5a31db3ce73cc185e not found: ID does not exist" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.112823 5118 scope.go:117] "RemoveContainer" containerID="5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d" Feb 23 07:53:44 crc kubenswrapper[5118]: E0223 07:53:44.113141 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d\": container with ID starting with 5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d not found: ID does not exist" containerID="5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.113167 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d"} err="failed to get container status \"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d\": rpc error: code = NotFound desc = could not find container \"5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d\": container with ID starting with 5ca991bb1864f058dcab303290000d52bc8e780dad798b78434f28fdba45ab3d not found: ID does not exist" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.113184 5118 scope.go:117] "RemoveContainer" containerID="f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad" Feb 23 07:53:44 crc kubenswrapper[5118]: E0223 07:53:44.113515 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad\": container with ID starting with f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad not found: ID does 
not exist" containerID="f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.113560 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad"} err="failed to get container status \"f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad\": rpc error: code = NotFound desc = could not find container \"f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad\": container with ID starting with f7c0aa426a0dde979280531bc3636b9910d662d63db1850536ffe79db9d0a3ad not found: ID does not exist" Feb 23 07:53:44 crc kubenswrapper[5118]: I0223 07:53:44.186289 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:44 crc kubenswrapper[5118]: W0223 07:53:44.218663 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312fb082_ccee_4d0b_8e43_c4bd425f0d8e.slice/crio-b19426d7d12a589f37bae56e881b46c64a1c65278b6be7b228939d9438d862ae WatchSource:0}: Error finding container b19426d7d12a589f37bae56e881b46c64a1c65278b6be7b228939d9438d862ae: Status 404 returned error can't find the container with id b19426d7d12a589f37bae56e881b46c64a1c65278b6be7b228939d9438d862ae Feb 23 07:53:45 crc kubenswrapper[5118]: I0223 07:53:45.065822 5118 generic.go:334] "Generic (PLEG): container finished" podID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerID="7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89" exitCode=0 Feb 23 07:53:45 crc kubenswrapper[5118]: I0223 07:53:45.065932 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerDied","Data":"7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89"} Feb 23 07:53:45 crc 
kubenswrapper[5118]: I0223 07:53:45.066421 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerStarted","Data":"b19426d7d12a589f37bae56e881b46c64a1c65278b6be7b228939d9438d862ae"} Feb 23 07:53:45 crc kubenswrapper[5118]: I0223 07:53:45.696980 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:53:45 crc kubenswrapper[5118]: E0223 07:53:45.697411 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:53:45 crc kubenswrapper[5118]: I0223 07:53:45.715319 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" path="/var/lib/kubelet/pods/38443ff1-0890-4277-a2db-c85e96ab95fa/volumes" Feb 23 07:53:46 crc kubenswrapper[5118]: I0223 07:53:46.081246 5118 generic.go:334] "Generic (PLEG): container finished" podID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerID="59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2" exitCode=0 Feb 23 07:53:46 crc kubenswrapper[5118]: I0223 07:53:46.081312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerDied","Data":"59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2"} Feb 23 07:53:47 crc kubenswrapper[5118]: I0223 07:53:47.094765 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" 
event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerStarted","Data":"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414"} Feb 23 07:53:47 crc kubenswrapper[5118]: I0223 07:53:47.129766 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r4k7r" podStartSLOduration=2.74640161 podStartE2EDuration="4.129735022s" podCreationTimestamp="2026-02-23 07:53:43 +0000 UTC" firstStartedPulling="2026-02-23 07:53:45.068970847 +0000 UTC m=+4088.072755460" lastFinishedPulling="2026-02-23 07:53:46.452304259 +0000 UTC m=+4089.456088872" observedRunningTime="2026-02-23 07:53:47.127014246 +0000 UTC m=+4090.130798809" watchObservedRunningTime="2026-02-23 07:53:47.129735022 +0000 UTC m=+4090.133519635" Feb 23 07:53:53 crc kubenswrapper[5118]: I0223 07:53:53.716644 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:53 crc kubenswrapper[5118]: I0223 07:53:53.717496 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:53 crc kubenswrapper[5118]: I0223 07:53:53.780031 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:54 crc kubenswrapper[5118]: I0223 07:53:54.204627 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:54 crc kubenswrapper[5118]: I0223 07:53:54.256088 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.177587 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r4k7r" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="registry-server" 
containerID="cri-o://055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414" gracePeriod=2 Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.627011 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.703196 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities\") pod \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.703358 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r64fd\" (UniqueName: \"kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd\") pod \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.703592 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content\") pod \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\" (UID: \"312fb082-ccee-4d0b-8e43-c4bd425f0d8e\") " Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.704705 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities" (OuterVolumeSpecName: "utilities") pod "312fb082-ccee-4d0b-8e43-c4bd425f0d8e" (UID: "312fb082-ccee-4d0b-8e43-c4bd425f0d8e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.712941 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd" (OuterVolumeSpecName: "kube-api-access-r64fd") pod "312fb082-ccee-4d0b-8e43-c4bd425f0d8e" (UID: "312fb082-ccee-4d0b-8e43-c4bd425f0d8e"). InnerVolumeSpecName "kube-api-access-r64fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.730776 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "312fb082-ccee-4d0b-8e43-c4bd425f0d8e" (UID: "312fb082-ccee-4d0b-8e43-c4bd425f0d8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.805801 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.806743 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r64fd\" (UniqueName: \"kubernetes.io/projected/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-kube-api-access-r64fd\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:56 crc kubenswrapper[5118]: I0223 07:53:56.806784 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/312fb082-ccee-4d0b-8e43-c4bd425f0d8e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.191755 5118 generic.go:334] "Generic (PLEG): container finished" podID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" 
containerID="055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414" exitCode=0 Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.191830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerDied","Data":"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414"} Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.191862 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r4k7r" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.191926 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r4k7r" event={"ID":"312fb082-ccee-4d0b-8e43-c4bd425f0d8e","Type":"ContainerDied","Data":"b19426d7d12a589f37bae56e881b46c64a1c65278b6be7b228939d9438d862ae"} Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.191963 5118 scope.go:117] "RemoveContainer" containerID="055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.226979 5118 scope.go:117] "RemoveContainer" containerID="59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.254942 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.266183 5118 scope.go:117] "RemoveContainer" containerID="7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.267309 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r4k7r"] Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.290468 5118 scope.go:117] "RemoveContainer" containerID="055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414" Feb 23 
07:53:57 crc kubenswrapper[5118]: E0223 07:53:57.291170 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414\": container with ID starting with 055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414 not found: ID does not exist" containerID="055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.291239 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414"} err="failed to get container status \"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414\": rpc error: code = NotFound desc = could not find container \"055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414\": container with ID starting with 055a988d2d36193634e100eabdf225b0d25f867407a50021a9a9cc95a624d414 not found: ID does not exist" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.291282 5118 scope.go:117] "RemoveContainer" containerID="59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2" Feb 23 07:53:57 crc kubenswrapper[5118]: E0223 07:53:57.291858 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2\": container with ID starting with 59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2 not found: ID does not exist" containerID="59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.291888 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2"} err="failed to get container status 
\"59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2\": rpc error: code = NotFound desc = could not find container \"59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2\": container with ID starting with 59a6aa78f424d0d71f9488029ab15205771c5a744eff678fa8091c8582c889a2 not found: ID does not exist" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.291921 5118 scope.go:117] "RemoveContainer" containerID="7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89" Feb 23 07:53:57 crc kubenswrapper[5118]: E0223 07:53:57.292329 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89\": container with ID starting with 7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89 not found: ID does not exist" containerID="7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.292386 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89"} err="failed to get container status \"7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89\": rpc error: code = NotFound desc = could not find container \"7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89\": container with ID starting with 7f27582a01155e0152dc030ddef5902cc532367e4bb31ba1fdbfa5f7123bdb89 not found: ID does not exist" Feb 23 07:53:57 crc kubenswrapper[5118]: I0223 07:53:57.719719 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" path="/var/lib/kubelet/pods/312fb082-ccee-4d0b-8e43-c4bd425f0d8e/volumes" Feb 23 07:53:58 crc kubenswrapper[5118]: I0223 07:53:58.698487 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 
07:53:58 crc kubenswrapper[5118]: E0223 07:53:58.699414 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:54:09 crc kubenswrapper[5118]: I0223 07:54:09.697804 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:54:09 crc kubenswrapper[5118]: E0223 07:54:09.699009 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:54:22 crc kubenswrapper[5118]: I0223 07:54:22.698282 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1" Feb 23 07:54:22 crc kubenswrapper[5118]: E0223 07:54:22.699451 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 07:54:33 crc kubenswrapper[5118]: I0223 07:54:33.697852 5118 scope.go:117] "RemoveContainer" 
containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:54:33 crc kubenswrapper[5118]: E0223 07:54:33.699068 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:54:46 crc kubenswrapper[5118]: I0223 07:54:46.699082 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:54:46 crc kubenswrapper[5118]: E0223 07:54:46.702067 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:55:01 crc kubenswrapper[5118]: I0223 07:55:01.698722 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:55:01 crc kubenswrapper[5118]: E0223 07:55:01.700294 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:55:13 crc kubenswrapper[5118]: I0223 07:55:13.698475 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:55:13 crc kubenswrapper[5118]: E0223 07:55:13.699886 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:55:26 crc kubenswrapper[5118]: I0223 07:55:26.698281 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:55:26 crc kubenswrapper[5118]: E0223 07:55:26.699987 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 07:55:41 crc kubenswrapper[5118]: I0223 07:55:41.697820 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:55:43 crc kubenswrapper[5118]: I0223 07:55:43.261130 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56"}
Feb 23 07:58:02 crc kubenswrapper[5118]: I0223 07:58:02.975093 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:58:02 crc kubenswrapper[5118]: I0223 07:58:02.975769 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:58:32 crc kubenswrapper[5118]: I0223 07:58:32.975400 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:58:32 crc kubenswrapper[5118]: I0223 07:58:32.976209 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:59:02 crc kubenswrapper[5118]: I0223 07:59:02.975757 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 07:59:02 crc kubenswrapper[5118]: I0223 07:59:02.976776 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 07:59:02 crc kubenswrapper[5118]: I0223 07:59:02.976842 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 07:59:02 crc kubenswrapper[5118]: I0223 07:59:02.977765 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 07:59:02 crc kubenswrapper[5118]: I0223 07:59:02.977849 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56" gracePeriod=600
Feb 23 07:59:03 crc kubenswrapper[5118]: I0223 07:59:03.781702 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56" exitCode=0
Feb 23 07:59:03 crc kubenswrapper[5118]: I0223 07:59:03.781801 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56"}
Feb 23 07:59:03 crc kubenswrapper[5118]: I0223 07:59:03.782353 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"}
Feb 23 07:59:03 crc kubenswrapper[5118]: I0223 07:59:03.782388 5118 scope.go:117] "RemoveContainer" containerID="77105b2815a9689d4f963ea9287acf30c8d4707072eb208a40dc9fbbb892b5c1"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.151766 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152645 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152662 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152682 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="extract-utilities"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152691 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="extract-utilities"
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152713 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152722 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152733 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="extract-content"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152764 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="extract-content"
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152778 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="extract-content"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152785 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="extract-content"
Feb 23 07:59:46 crc kubenswrapper[5118]: E0223 07:59:46.152798 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="extract-utilities"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.152807 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="extract-utilities"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.153025 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="312fb082-ccee-4d0b-8e43-c4bd425f0d8e" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.153046 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="38443ff1-0890-4277-a2db-c85e96ab95fa" containerName="registry-server"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.156415 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.166695 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.189088 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xp9f\" (UniqueName: \"kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.189305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.189376 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.291441 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xp9f\" (UniqueName: \"kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.291517 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.291540 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.292088 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.292200 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.320952 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xp9f\" (UniqueName: \"kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f\") pod \"redhat-operators-cv7x7\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") " pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:46 crc kubenswrapper[5118]: I0223 07:59:46.485347 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:47 crc kubenswrapper[5118]: I0223 07:59:47.036816 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 07:59:47 crc kubenswrapper[5118]: I0223 07:59:47.219250 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerStarted","Data":"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b"}
Feb 23 07:59:47 crc kubenswrapper[5118]: I0223 07:59:47.219332 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerStarted","Data":"81dfaa1fd251123f53cf2f4479a0f4be05ae66fccd37f5ac03dd9e28d3ba8733"}
Feb 23 07:59:48 crc kubenswrapper[5118]: I0223 07:59:48.229915 5118 generic.go:334] "Generic (PLEG): container finished" podID="570dbf79-52d9-4920-90ed-8e5c56332179" containerID="9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b" exitCode=0
Feb 23 07:59:48 crc kubenswrapper[5118]: I0223 07:59:48.230001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerDied","Data":"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b"}
Feb 23 07:59:48 crc kubenswrapper[5118]: I0223 07:59:48.234166 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 07:59:49 crc kubenswrapper[5118]: I0223 07:59:49.245410 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerStarted","Data":"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a"}
Feb 23 07:59:50 crc kubenswrapper[5118]: I0223 07:59:50.269641 5118 generic.go:334] "Generic (PLEG): container finished" podID="570dbf79-52d9-4920-90ed-8e5c56332179" containerID="84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a" exitCode=0
Feb 23 07:59:50 crc kubenswrapper[5118]: I0223 07:59:50.269821 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerDied","Data":"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a"}
Feb 23 07:59:51 crc kubenswrapper[5118]: I0223 07:59:51.298828 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerStarted","Data":"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"}
Feb 23 07:59:51 crc kubenswrapper[5118]: I0223 07:59:51.323274 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cv7x7" podStartSLOduration=2.8706118160000003 podStartE2EDuration="5.323255829s" podCreationTimestamp="2026-02-23 07:59:46 +0000 UTC" firstStartedPulling="2026-02-23 07:59:48.23370517 +0000 UTC m=+4451.237489773" lastFinishedPulling="2026-02-23 07:59:50.686349173 +0000 UTC m=+4453.690133786" observedRunningTime="2026-02-23 07:59:51.317427079 +0000 UTC m=+4454.321211662" watchObservedRunningTime="2026-02-23 07:59:51.323255829 +0000 UTC m=+4454.327040402"
Feb 23 07:59:56 crc kubenswrapper[5118]: I0223 07:59:56.485974 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:56 crc kubenswrapper[5118]: I0223 07:59:56.486611 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 07:59:57 crc kubenswrapper[5118]: I0223 07:59:57.535687 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cv7x7" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="registry-server" probeResult="failure" output=<
Feb 23 07:59:57 crc kubenswrapper[5118]: 	timeout: failed to connect service ":50051" within 1s
Feb 23 07:59:57 crc kubenswrapper[5118]:  >
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.217918 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"]
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.220139 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.224458 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.225469 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.226973 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"]
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.333643 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgb4r\" (UniqueName: \"kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.333711 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.333772 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.435213 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgb4r\" (UniqueName: \"kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.435323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.435429 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.436511 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.445966 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.457228 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgb4r\" (UniqueName: \"kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r\") pod \"collect-profiles-29530560-kr8cj\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:00 crc kubenswrapper[5118]: I0223 08:00:00.549861 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:01 crc kubenswrapper[5118]: I0223 08:00:01.485979 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"]
Feb 23 08:00:01 crc kubenswrapper[5118]: W0223 08:00:01.496019 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917bdb73_96fa_41cf_b160_791f8e4503b7.slice/crio-27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c WatchSource:0}: Error finding container 27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c: Status 404 returned error can't find the container with id 27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c
Feb 23 08:00:02 crc kubenswrapper[5118]: I0223 08:00:02.392923 5118 generic.go:334] "Generic (PLEG): container finished" podID="917bdb73-96fa-41cf-b160-791f8e4503b7" containerID="f4fba51cf8567d8478c874550c14c64c3d870a99fe36a9035a8c0549a98d79ee" exitCode=0
Feb 23 08:00:02 crc kubenswrapper[5118]: I0223 08:00:02.393007 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj" event={"ID":"917bdb73-96fa-41cf-b160-791f8e4503b7","Type":"ContainerDied","Data":"f4fba51cf8567d8478c874550c14c64c3d870a99fe36a9035a8c0549a98d79ee"}
Feb 23 08:00:02 crc kubenswrapper[5118]: I0223 08:00:02.393580 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj" event={"ID":"917bdb73-96fa-41cf-b160-791f8e4503b7","Type":"ContainerStarted","Data":"27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c"}
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.770747 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.785040 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume\") pod \"917bdb73-96fa-41cf-b160-791f8e4503b7\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") "
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.785134 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgb4r\" (UniqueName: \"kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r\") pod \"917bdb73-96fa-41cf-b160-791f8e4503b7\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") "
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.785272 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume\") pod \"917bdb73-96fa-41cf-b160-791f8e4503b7\" (UID: \"917bdb73-96fa-41cf-b160-791f8e4503b7\") "
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.785973 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume" (OuterVolumeSpecName: "config-volume") pod "917bdb73-96fa-41cf-b160-791f8e4503b7" (UID: "917bdb73-96fa-41cf-b160-791f8e4503b7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.794210 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r" (OuterVolumeSpecName: "kube-api-access-sgb4r") pod "917bdb73-96fa-41cf-b160-791f8e4503b7" (UID: "917bdb73-96fa-41cf-b160-791f8e4503b7"). InnerVolumeSpecName "kube-api-access-sgb4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.794908 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "917bdb73-96fa-41cf-b160-791f8e4503b7" (UID: "917bdb73-96fa-41cf-b160-791f8e4503b7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.886992 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/917bdb73-96fa-41cf-b160-791f8e4503b7-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.887342 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/917bdb73-96fa-41cf-b160-791f8e4503b7-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:03 crc kubenswrapper[5118]: I0223 08:00:03.887469 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgb4r\" (UniqueName: \"kubernetes.io/projected/917bdb73-96fa-41cf-b160-791f8e4503b7-kube-api-access-sgb4r\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:04 crc kubenswrapper[5118]: I0223 08:00:04.413761 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj" event={"ID":"917bdb73-96fa-41cf-b160-791f8e4503b7","Type":"ContainerDied","Data":"27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c"}
Feb 23 08:00:04 crc kubenswrapper[5118]: I0223 08:00:04.413822 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27dbf8c7747b78a438887ce228a7d491aa7d995f87b526e105e1751fb8f4484c"
Feb 23 08:00:04 crc kubenswrapper[5118]: I0223 08:00:04.414506 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"
Feb 23 08:00:04 crc kubenswrapper[5118]: I0223 08:00:04.878363 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"]
Feb 23 08:00:04 crc kubenswrapper[5118]: I0223 08:00:04.885918 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530515-frp56"]
Feb 23 08:00:05 crc kubenswrapper[5118]: I0223 08:00:05.716462 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="239bb60c-59c0-4f58-af9f-aa12fc4226d5" path="/var/lib/kubelet/pods/239bb60c-59c0-4f58-af9f-aa12fc4226d5/volumes"
Feb 23 08:00:06 crc kubenswrapper[5118]: I0223 08:00:06.554581 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 08:00:06 crc kubenswrapper[5118]: I0223 08:00:06.648550 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 08:00:06 crc kubenswrapper[5118]: I0223 08:00:06.811987 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 08:00:08 crc kubenswrapper[5118]: I0223 08:00:08.468970 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cv7x7" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="registry-server" containerID="cri-o://534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75" gracePeriod=2
Feb 23 08:00:08 crc kubenswrapper[5118]: I0223 08:00:08.973868 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.075077 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities\") pod \"570dbf79-52d9-4920-90ed-8e5c56332179\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") "
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.075472 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xp9f\" (UniqueName: \"kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f\") pod \"570dbf79-52d9-4920-90ed-8e5c56332179\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") "
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.075617 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content\") pod \"570dbf79-52d9-4920-90ed-8e5c56332179\" (UID: \"570dbf79-52d9-4920-90ed-8e5c56332179\") "
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.076139 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities" (OuterVolumeSpecName: "utilities") pod "570dbf79-52d9-4920-90ed-8e5c56332179" (UID: "570dbf79-52d9-4920-90ed-8e5c56332179"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.076473 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.083414 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f" (OuterVolumeSpecName: "kube-api-access-4xp9f") pod "570dbf79-52d9-4920-90ed-8e5c56332179" (UID: "570dbf79-52d9-4920-90ed-8e5c56332179"). InnerVolumeSpecName "kube-api-access-4xp9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.177424 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xp9f\" (UniqueName: \"kubernetes.io/projected/570dbf79-52d9-4920-90ed-8e5c56332179-kube-api-access-4xp9f\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.246836 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "570dbf79-52d9-4920-90ed-8e5c56332179" (UID: "570dbf79-52d9-4920-90ed-8e5c56332179"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.279073 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/570dbf79-52d9-4920-90ed-8e5c56332179-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.481235 5118 generic.go:334] "Generic (PLEG): container finished" podID="570dbf79-52d9-4920-90ed-8e5c56332179" containerID="534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75" exitCode=0
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.481295 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerDied","Data":"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"}
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.481326 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cv7x7" event={"ID":"570dbf79-52d9-4920-90ed-8e5c56332179","Type":"ContainerDied","Data":"81dfaa1fd251123f53cf2f4479a0f4be05ae66fccd37f5ac03dd9e28d3ba8733"}
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.481347 5118 scope.go:117] "RemoveContainer" containerID="534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.481433 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cv7x7"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.518570 5118 scope.go:117] "RemoveContainer" containerID="84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.553535 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.565191 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cv7x7"]
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.565534 5118 scope.go:117] "RemoveContainer" containerID="9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.611381 5118 scope.go:117] "RemoveContainer" containerID="534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"
Feb 23 08:00:09 crc kubenswrapper[5118]: E0223 08:00:09.611923 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75\": container with ID starting with 534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75 not found: ID does not exist" containerID="534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"
Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.611961 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75"} err="failed to get container status \"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75\": rpc error: code = NotFound desc = could not find container \"534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75\": container with ID starting with 534617f1aaa3a92c0c6720300149818ce085d149fa3ea3ed7ebda958f09a6e75 not found: ID does
not exist" Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.611986 5118 scope.go:117] "RemoveContainer" containerID="84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a" Feb 23 08:00:09 crc kubenswrapper[5118]: E0223 08:00:09.612343 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a\": container with ID starting with 84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a not found: ID does not exist" containerID="84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a" Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.612376 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a"} err="failed to get container status \"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a\": rpc error: code = NotFound desc = could not find container \"84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a\": container with ID starting with 84312418bfbad712f3abacb6ee0a8f967ad10004fe870f57007d20ec469fc06a not found: ID does not exist" Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.612390 5118 scope.go:117] "RemoveContainer" containerID="9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b" Feb 23 08:00:09 crc kubenswrapper[5118]: E0223 08:00:09.612728 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b\": container with ID starting with 9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b not found: ID does not exist" containerID="9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b" Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.612804 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b"} err="failed to get container status \"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b\": rpc error: code = NotFound desc = could not find container \"9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b\": container with ID starting with 9d7f963837385a7d5083d5e7a21350e25a6d8cfa945ec43a37d20ead4716a68b not found: ID does not exist" Feb 23 08:00:09 crc kubenswrapper[5118]: I0223 08:00:09.715180 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" path="/var/lib/kubelet/pods/570dbf79-52d9-4920-90ed-8e5c56332179/volumes" Feb 23 08:00:50 crc kubenswrapper[5118]: I0223 08:00:50.275140 5118 scope.go:117] "RemoveContainer" containerID="5a1a2212618fb7f02f27fc3c3d2b8b978cd62e1cbb98ea3df5895ffd931b29a9" Feb 23 08:01:32 crc kubenswrapper[5118]: I0223 08:01:32.974958 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:01:32 crc kubenswrapper[5118]: I0223 08:01:32.975631 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:02 crc kubenswrapper[5118]: I0223 08:02:02.975922 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 23 08:02:02 crc kubenswrapper[5118]: I0223 08:02:02.976585 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:32 crc kubenswrapper[5118]: I0223 08:02:32.975369 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:02:32 crc kubenswrapper[5118]: I0223 08:02:32.976904 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:02:32 crc kubenswrapper[5118]: I0223 08:02:32.977041 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:02:32 crc kubenswrapper[5118]: I0223 08:02:32.978049 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:02:32 crc kubenswrapper[5118]: I0223 08:02:32.978188 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" gracePeriod=600 Feb 23 08:02:33 crc kubenswrapper[5118]: E0223 08:02:33.134490 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:02:33 crc kubenswrapper[5118]: I0223 08:02:33.205849 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" exitCode=0 Feb 23 08:02:33 crc kubenswrapper[5118]: I0223 08:02:33.205974 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"} Feb 23 08:02:33 crc kubenswrapper[5118]: I0223 08:02:33.206495 5118 scope.go:117] "RemoveContainer" containerID="03f2329de6b5a92237d57da1a256a4f60b244685b3649e86a5af6e626ef42f56" Feb 23 08:02:33 crc kubenswrapper[5118]: I0223 08:02:33.213827 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" Feb 23 08:02:33 crc kubenswrapper[5118]: E0223 08:02:33.214485 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:02:46 crc kubenswrapper[5118]: I0223 08:02:46.698185 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" Feb 23 08:02:46 crc kubenswrapper[5118]: E0223 08:02:46.699382 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:03:00 crc kubenswrapper[5118]: I0223 08:03:00.698327 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" Feb 23 08:03:00 crc kubenswrapper[5118]: E0223 08:03:00.700139 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.800766 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"] Feb 23 08:03:02 crc kubenswrapper[5118]: E0223 08:03:02.801796 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="extract-utilities" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 
08:03:02.801813 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="extract-utilities" Feb 23 08:03:02 crc kubenswrapper[5118]: E0223 08:03:02.801830 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917bdb73-96fa-41cf-b160-791f8e4503b7" containerName="collect-profiles" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.801838 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="917bdb73-96fa-41cf-b160-791f8e4503b7" containerName="collect-profiles" Feb 23 08:03:02 crc kubenswrapper[5118]: E0223 08:03:02.801858 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="extract-content" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.801868 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="extract-content" Feb 23 08:03:02 crc kubenswrapper[5118]: E0223 08:03:02.801889 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="registry-server" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.801897 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="registry-server" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.802092 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="570dbf79-52d9-4920-90ed-8e5c56332179" containerName="registry-server" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.802144 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="917bdb73-96fa-41cf-b160-791f8e4503b7" containerName="collect-profiles" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.803668 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.815247 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"] Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.848515 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.848765 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8njg\" (UniqueName: \"kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.849116 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.950204 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.950268 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.950322 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8njg\" (UniqueName: \"kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.950773 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.950977 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:02 crc kubenswrapper[5118]: I0223 08:03:02.978834 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8njg\" (UniqueName: \"kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg\") pod \"certified-operators-x5rhj\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:03 crc kubenswrapper[5118]: I0223 08:03:03.143369 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:03 crc kubenswrapper[5118]: I0223 08:03:03.684427 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"] Feb 23 08:03:04 crc kubenswrapper[5118]: I0223 08:03:04.539992 5118 generic.go:334] "Generic (PLEG): container finished" podID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerID="8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e" exitCode=0 Feb 23 08:03:04 crc kubenswrapper[5118]: I0223 08:03:04.540138 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerDied","Data":"8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e"} Feb 23 08:03:04 crc kubenswrapper[5118]: I0223 08:03:04.543558 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerStarted","Data":"c07fe43471a891c90dde84c399105de5224241e1fde25e8ba7190f6f29a51871"} Feb 23 08:03:05 crc kubenswrapper[5118]: I0223 08:03:05.552730 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerStarted","Data":"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5"} Feb 23 08:03:05 crc kubenswrapper[5118]: E0223 08:03:05.693517 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ca64fa_f0a0_43d8_be95_99853896652e.slice/crio-8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ca64fa_f0a0_43d8_be95_99853896652e.slice/crio-conmon-8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:03:06 crc kubenswrapper[5118]: I0223 08:03:06.563817 5118 generic.go:334] "Generic (PLEG): container finished" podID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerID="8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5" exitCode=0 Feb 23 08:03:06 crc kubenswrapper[5118]: I0223 08:03:06.563905 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerDied","Data":"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5"} Feb 23 08:03:07 crc kubenswrapper[5118]: I0223 08:03:07.576321 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerStarted","Data":"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96"} Feb 23 08:03:07 crc kubenswrapper[5118]: I0223 08:03:07.612421 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x5rhj" podStartSLOduration=3.176324453 podStartE2EDuration="5.612393386s" podCreationTimestamp="2026-02-23 08:03:02 +0000 UTC" firstStartedPulling="2026-02-23 08:03:04.543169528 +0000 UTC m=+4647.546954131" lastFinishedPulling="2026-02-23 08:03:06.979238461 +0000 UTC m=+4649.983023064" observedRunningTime="2026-02-23 08:03:07.605377837 +0000 UTC m=+4650.609162420" watchObservedRunningTime="2026-02-23 08:03:07.612393386 +0000 UTC m=+4650.616178009" Feb 23 08:03:11 crc kubenswrapper[5118]: I0223 08:03:11.698034 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" Feb 23 08:03:11 crc 
kubenswrapper[5118]: E0223 08:03:11.700579 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:03:13 crc kubenswrapper[5118]: I0223 08:03:13.144021 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:13 crc kubenswrapper[5118]: I0223 08:03:13.144386 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:13 crc kubenswrapper[5118]: I0223 08:03:13.225440 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:13 crc kubenswrapper[5118]: I0223 08:03:13.724792 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:13 crc kubenswrapper[5118]: I0223 08:03:13.791880 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"] Feb 23 08:03:15 crc kubenswrapper[5118]: I0223 08:03:15.684576 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x5rhj" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="registry-server" containerID="cri-o://23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96" gracePeriod=2 Feb 23 08:03:15 crc kubenswrapper[5118]: E0223 08:03:15.919032 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": 
RecentStats: unable to find data in memory cache]" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.638883 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.703703 5118 generic.go:334] "Generic (PLEG): container finished" podID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerID="23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96" exitCode=0 Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.703757 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerDied","Data":"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96"} Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.704385 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x5rhj" event={"ID":"c6ca64fa-f0a0-43d8-be95-99853896652e","Type":"ContainerDied","Data":"c07fe43471a891c90dde84c399105de5224241e1fde25e8ba7190f6f29a51871"} Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.704432 5118 scope.go:117] "RemoveContainer" containerID="23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.703810 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x5rhj" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.712676 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content\") pod \"c6ca64fa-f0a0-43d8-be95-99853896652e\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.712831 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities\") pod \"c6ca64fa-f0a0-43d8-be95-99853896652e\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.713031 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8njg\" (UniqueName: \"kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg\") pod \"c6ca64fa-f0a0-43d8-be95-99853896652e\" (UID: \"c6ca64fa-f0a0-43d8-be95-99853896652e\") " Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.714004 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities" (OuterVolumeSpecName: "utilities") pod "c6ca64fa-f0a0-43d8-be95-99853896652e" (UID: "c6ca64fa-f0a0-43d8-be95-99853896652e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.721233 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg" (OuterVolumeSpecName: "kube-api-access-d8njg") pod "c6ca64fa-f0a0-43d8-be95-99853896652e" (UID: "c6ca64fa-f0a0-43d8-be95-99853896652e"). InnerVolumeSpecName "kube-api-access-d8njg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.737953 5118 scope.go:117] "RemoveContainer" containerID="8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.775325 5118 scope.go:117] "RemoveContainer" containerID="8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.788927 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6ca64fa-f0a0-43d8-be95-99853896652e" (UID: "c6ca64fa-f0a0-43d8-be95-99853896652e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.807758 5118 scope.go:117] "RemoveContainer" containerID="23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96" Feb 23 08:03:16 crc kubenswrapper[5118]: E0223 08:03:16.808215 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96\": container with ID starting with 23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96 not found: ID does not exist" containerID="23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.808261 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96"} err="failed to get container status \"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96\": rpc error: code = NotFound desc = could not find container \"23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96\": container with ID starting 
with 23f02156804ac34abb4c4a20160c76f3748fc316dbc40ca38a07c88d0679ca96 not found: ID does not exist" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.808295 5118 scope.go:117] "RemoveContainer" containerID="8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5" Feb 23 08:03:16 crc kubenswrapper[5118]: E0223 08:03:16.808742 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5\": container with ID starting with 8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5 not found: ID does not exist" containerID="8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.808788 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5"} err="failed to get container status \"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5\": rpc error: code = NotFound desc = could not find container \"8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5\": container with ID starting with 8fb40f08c3a69a6db2eae9ffedf9585fa597884b2454361560533d77b082baa5 not found: ID does not exist" Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.808813 5118 scope.go:117] "RemoveContainer" containerID="8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e" Feb 23 08:03:16 crc kubenswrapper[5118]: E0223 08:03:16.809244 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e\": container with ID starting with 8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e not found: ID does not exist" containerID="8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e" Feb 23 08:03:16 
crc kubenswrapper[5118]: I0223 08:03:16.809315 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e"} err="failed to get container status \"8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e\": rpc error: code = NotFound desc = could not find container \"8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e\": container with ID starting with 8a684506e0c7a6ef2f8ba0fd36cd88d4dae3dfba854820394c289b754a02768e not found: ID does not exist"
Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.815569 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.815605 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ca64fa-f0a0-43d8-be95-99853896652e-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:03:16 crc kubenswrapper[5118]: I0223 08:03:16.815624 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8njg\" (UniqueName: \"kubernetes.io/projected/c6ca64fa-f0a0-43d8-be95-99853896652e-kube-api-access-d8njg\") on node \"crc\" DevicePath \"\""
Feb 23 08:03:17 crc kubenswrapper[5118]: I0223 08:03:17.063696 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"]
Feb 23 08:03:17 crc kubenswrapper[5118]: I0223 08:03:17.075024 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x5rhj"]
Feb 23 08:03:17 crc kubenswrapper[5118]: I0223 08:03:17.715968 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" path="/var/lib/kubelet/pods/c6ca64fa-f0a0-43d8-be95-99853896652e/volumes"
Feb 23 08:03:25 crc kubenswrapper[5118]: I0223 08:03:25.698233 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:03:25 crc kubenswrapper[5118]: E0223 08:03:25.699762 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:03:38 crc kubenswrapper[5118]: I0223 08:03:38.697591 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:03:38 crc kubenswrapper[5118]: E0223 08:03:38.699201 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:03:51 crc kubenswrapper[5118]: I0223 08:03:51.697420 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:03:51 crc kubenswrapper[5118]: E0223 08:03:51.698344 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:04:04 crc kubenswrapper[5118]: I0223 08:04:04.697233 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:04:04 crc kubenswrapper[5118]: E0223 08:04:04.698617 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:04:19 crc kubenswrapper[5118]: I0223 08:04:19.697777 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:04:19 crc kubenswrapper[5118]: E0223 08:04:19.698592 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:04:30 crc kubenswrapper[5118]: I0223 08:04:30.697802 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:04:30 crc kubenswrapper[5118]: E0223 08:04:30.698536 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:04:44 crc kubenswrapper[5118]: I0223 08:04:44.697671 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:04:44 crc kubenswrapper[5118]: E0223 08:04:44.698953 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:04:55 crc kubenswrapper[5118]: I0223 08:04:55.696799 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:04:55 crc kubenswrapper[5118]: E0223 08:04:55.697507 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:05:09 crc kubenswrapper[5118]: I0223 08:05:09.698066 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:05:09 crc kubenswrapper[5118]: E0223 08:05:09.699072 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:05:23 crc kubenswrapper[5118]: I0223 08:05:23.697317 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:05:23 crc kubenswrapper[5118]: E0223 08:05:23.698192 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:05:36 crc kubenswrapper[5118]: I0223 08:05:36.697755 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:05:36 crc kubenswrapper[5118]: E0223 08:05:36.699266 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:05:47 crc kubenswrapper[5118]: I0223 08:05:47.713802 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:05:47 crc kubenswrapper[5118]: E0223 08:05:47.724754 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:06:00 crc kubenswrapper[5118]: I0223 08:06:00.697881 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:06:00 crc kubenswrapper[5118]: E0223 08:06:00.698776 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:06:11 crc kubenswrapper[5118]: I0223 08:06:11.699567 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:06:11 crc kubenswrapper[5118]: E0223 08:06:11.703685 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:06:25 crc kubenswrapper[5118]: I0223 08:06:25.698435 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:06:25 crc kubenswrapper[5118]: E0223 08:06:25.699800 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:06:40 crc kubenswrapper[5118]: I0223 08:06:40.697454 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:06:40 crc kubenswrapper[5118]: E0223 08:06:40.698221 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:06:52 crc kubenswrapper[5118]: I0223 08:06:52.697387 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:06:52 crc kubenswrapper[5118]: E0223 08:06:52.698446 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:07:07 crc kubenswrapper[5118]: I0223 08:07:07.705350 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:07:07 crc kubenswrapper[5118]: E0223 08:07:07.706797 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:07:21 crc kubenswrapper[5118]: I0223 08:07:21.697966 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:07:21 crc kubenswrapper[5118]: E0223 08:07:21.698964 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:07:35 crc kubenswrapper[5118]: I0223 08:07:35.697697 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63"
Feb 23 08:07:36 crc kubenswrapper[5118]: I0223 08:07:36.273061 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16"}
Feb 23 08:08:12 crc kubenswrapper[5118]: I0223 08:08:12.983637 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2lj8f"]
Feb 23 08:08:13 crc kubenswrapper[5118]: E0223 08:08:12.997962 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="registry-server"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:12.998003 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="registry-server"
Feb 23 08:08:13 crc kubenswrapper[5118]: E0223 08:08:12.998028 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="extract-content"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:12.998042 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="extract-content"
Feb 23 08:08:13 crc kubenswrapper[5118]: E0223 08:08:12.998078 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="extract-utilities"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:12.998121 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="extract-utilities"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:12.998429 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ca64fa-f0a0-43d8-be95-99853896652e" containerName="registry-server"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.000671 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.005162 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lj8f"]
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.191201 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.191294 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z7td\" (UniqueName: \"kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.192017 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.294031 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z7td\" (UniqueName: \"kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.294341 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.294548 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.295083 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.295568 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.326041 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z7td\" (UniqueName: \"kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td\") pod \"community-operators-2lj8f\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") " pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.366759 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:13 crc kubenswrapper[5118]: I0223 08:08:13.917065 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lj8f"]
Feb 23 08:08:14 crc kubenswrapper[5118]: I0223 08:08:14.669338 5118 generic.go:334] "Generic (PLEG): container finished" podID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerID="af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc" exitCode=0
Feb 23 08:08:14 crc kubenswrapper[5118]: I0223 08:08:14.669419 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerDied","Data":"af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc"}
Feb 23 08:08:14 crc kubenswrapper[5118]: I0223 08:08:14.669862 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerStarted","Data":"0ff4f8edcd988f869207144582505d6dda6d150966ebfe168bf370a602598368"}
Feb 23 08:08:14 crc kubenswrapper[5118]: I0223 08:08:14.672410 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:08:15 crc kubenswrapper[5118]: I0223 08:08:15.681092 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerStarted","Data":"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc"}
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.568316 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"]
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.570194 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.583647 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"]
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.701519 5118 generic.go:334] "Generic (PLEG): container finished" podID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerID="5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc" exitCode=0
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.701621 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerDied","Data":"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc"}
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.760145 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.760213 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7g5\" (UniqueName: \"kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.760280 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.862391 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.862468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7g5\" (UniqueName: \"kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.862608 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.863342 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.863392 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.891407 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7g5\" (UniqueName: \"kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5\") pod \"redhat-marketplace-7vtpf\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:16 crc kubenswrapper[5118]: I0223 08:08:16.901883 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.397390 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"]
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.714286 5118 generic.go:334] "Generic (PLEG): container finished" podID="a3687254-2082-48d4-842a-c3060748b30e" containerID="189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f" exitCode=0
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.714392 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerDied","Data":"189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f"}
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.714430 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerStarted","Data":"0bba59c6029685357dce367edcc9e291915e3330af6b5992d157cbcc5484827b"}
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.719673 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerStarted","Data":"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f"}
Feb 23 08:08:17 crc kubenswrapper[5118]: I0223 08:08:17.765435 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2lj8f" podStartSLOduration=3.337690188 podStartE2EDuration="5.76540314s" podCreationTimestamp="2026-02-23 08:08:12 +0000 UTC" firstStartedPulling="2026-02-23 08:08:14.67205222 +0000 UTC m=+4957.675836803" lastFinishedPulling="2026-02-23 08:08:17.099765182 +0000 UTC m=+4960.103549755" observedRunningTime="2026-02-23 08:08:17.752518919 +0000 UTC m=+4960.756303512" watchObservedRunningTime="2026-02-23 08:08:17.76540314 +0000 UTC m=+4960.769187743"
Feb 23 08:08:18 crc kubenswrapper[5118]: I0223 08:08:18.734209 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerStarted","Data":"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f"}
Feb 23 08:08:19 crc kubenswrapper[5118]: I0223 08:08:19.744526 5118 generic.go:334] "Generic (PLEG): container finished" podID="a3687254-2082-48d4-842a-c3060748b30e" containerID="bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f" exitCode=0
Feb 23 08:08:19 crc kubenswrapper[5118]: I0223 08:08:19.744734 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerDied","Data":"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f"}
Feb 23 08:08:20 crc kubenswrapper[5118]: I0223 08:08:20.761467 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerStarted","Data":"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f"}
Feb 23 08:08:20 crc kubenswrapper[5118]: I0223 08:08:20.788508 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vtpf" podStartSLOduration=2.362438715 podStartE2EDuration="4.788481536s" podCreationTimestamp="2026-02-23 08:08:16 +0000 UTC" firstStartedPulling="2026-02-23 08:08:17.716133592 +0000 UTC m=+4960.719918195" lastFinishedPulling="2026-02-23 08:08:20.142176423 +0000 UTC m=+4963.145961016" observedRunningTime="2026-02-23 08:08:20.783975117 +0000 UTC m=+4963.787759690" watchObservedRunningTime="2026-02-23 08:08:20.788481536 +0000 UTC m=+4963.792266099"
Feb 23 08:08:23 crc kubenswrapper[5118]: I0223 08:08:23.367087 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:23 crc kubenswrapper[5118]: I0223 08:08:23.367852 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:23 crc kubenswrapper[5118]: I0223 08:08:23.420483 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:23 crc kubenswrapper[5118]: I0223 08:08:23.853248 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:25 crc kubenswrapper[5118]: I0223 08:08:25.362503 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lj8f"]
Feb 23 08:08:25 crc kubenswrapper[5118]: I0223 08:08:25.827598 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2lj8f" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="registry-server" containerID="cri-o://5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f" gracePeriod=2
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.290326 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.455214 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content\") pod \"42c7e4d1-0fbf-4984-864e-e795724aeed5\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") "
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.455313 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities\") pod \"42c7e4d1-0fbf-4984-864e-e795724aeed5\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") "
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.455413 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z7td\" (UniqueName: \"kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td\") pod \"42c7e4d1-0fbf-4984-864e-e795724aeed5\" (UID: \"42c7e4d1-0fbf-4984-864e-e795724aeed5\") "
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.456464 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities" (OuterVolumeSpecName: "utilities") pod "42c7e4d1-0fbf-4984-864e-e795724aeed5" (UID: "42c7e4d1-0fbf-4984-864e-e795724aeed5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.462694 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td" (OuterVolumeSpecName: "kube-api-access-4z7td") pod "42c7e4d1-0fbf-4984-864e-e795724aeed5" (UID: "42c7e4d1-0fbf-4984-864e-e795724aeed5"). InnerVolumeSpecName "kube-api-access-4z7td". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.533138 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42c7e4d1-0fbf-4984-864e-e795724aeed5" (UID: "42c7e4d1-0fbf-4984-864e-e795724aeed5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.557524 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.557584 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42c7e4d1-0fbf-4984-864e-e795724aeed5-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.557597 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z7td\" (UniqueName: \"kubernetes.io/projected/42c7e4d1-0fbf-4984-864e-e795724aeed5-kube-api-access-4z7td\") on node \"crc\" DevicePath \"\""
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.838828 5118 generic.go:334] "Generic (PLEG): container finished" podID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerID="5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f" exitCode=0
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.838893 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lj8f"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.838915 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerDied","Data":"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f"}
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.839029 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lj8f" event={"ID":"42c7e4d1-0fbf-4984-864e-e795724aeed5","Type":"ContainerDied","Data":"0ff4f8edcd988f869207144582505d6dda6d150966ebfe168bf370a602598368"}
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.839061 5118 scope.go:117] "RemoveContainer" containerID="5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.861181 5118 scope.go:117] "RemoveContainer" containerID="5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.887511 5118 scope.go:117] "RemoveContainer" containerID="af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.896192 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lj8f"]
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.902227 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vtpf"
Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.902293 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-marketplace/redhat-marketplace-7vtpf" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.902897 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2lj8f"] Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.916897 5118 scope.go:117] "RemoveContainer" containerID="5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f" Feb 23 08:08:26 crc kubenswrapper[5118]: E0223 08:08:26.917529 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f\": container with ID starting with 5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f not found: ID does not exist" containerID="5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.917589 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f"} err="failed to get container status \"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f\": rpc error: code = NotFound desc = could not find container \"5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f\": container with ID starting with 5c2c1130ba83b16407d4568a0236c564c78f329494db59026b6baa04c7a9459f not found: ID does not exist" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.917635 5118 scope.go:117] "RemoveContainer" containerID="5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc" Feb 23 08:08:26 crc kubenswrapper[5118]: E0223 08:08:26.918149 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc\": container with ID starting with 5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc 
not found: ID does not exist" containerID="5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.918172 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc"} err="failed to get container status \"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc\": rpc error: code = NotFound desc = could not find container \"5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc\": container with ID starting with 5f35da7107fd005b389fa5500b7ff1aec8c89ce5b94acfe0dee83e6c68e6cacc not found: ID does not exist" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.918186 5118 scope.go:117] "RemoveContainer" containerID="af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc" Feb 23 08:08:26 crc kubenswrapper[5118]: E0223 08:08:26.918513 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc\": container with ID starting with af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc not found: ID does not exist" containerID="af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 08:08:26.918554 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc"} err="failed to get container status \"af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc\": rpc error: code = NotFound desc = could not find container \"af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc\": container with ID starting with af3c6626d78c072cbe5ac4da8edf4ef8eab9c73167123c8fd72cce21f1594efc not found: ID does not exist" Feb 23 08:08:26 crc kubenswrapper[5118]: I0223 
08:08:26.978316 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vtpf" Feb 23 08:08:27 crc kubenswrapper[5118]: I0223 08:08:27.714745 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" path="/var/lib/kubelet/pods/42c7e4d1-0fbf-4984-864e-e795724aeed5/volumes" Feb 23 08:08:27 crc kubenswrapper[5118]: I0223 08:08:27.904691 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vtpf" Feb 23 08:08:29 crc kubenswrapper[5118]: I0223 08:08:29.963787 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"] Feb 23 08:08:29 crc kubenswrapper[5118]: I0223 08:08:29.965557 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7vtpf" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="registry-server" containerID="cri-o://31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f" gracePeriod=2 Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.513669 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vtpf" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.636231 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww7g5\" (UniqueName: \"kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5\") pod \"a3687254-2082-48d4-842a-c3060748b30e\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.636441 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content\") pod \"a3687254-2082-48d4-842a-c3060748b30e\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.636560 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities\") pod \"a3687254-2082-48d4-842a-c3060748b30e\" (UID: \"a3687254-2082-48d4-842a-c3060748b30e\") " Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.637601 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities" (OuterVolumeSpecName: "utilities") pod "a3687254-2082-48d4-842a-c3060748b30e" (UID: "a3687254-2082-48d4-842a-c3060748b30e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.645174 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5" (OuterVolumeSpecName: "kube-api-access-ww7g5") pod "a3687254-2082-48d4-842a-c3060748b30e" (UID: "a3687254-2082-48d4-842a-c3060748b30e"). InnerVolumeSpecName "kube-api-access-ww7g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.670460 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3687254-2082-48d4-842a-c3060748b30e" (UID: "a3687254-2082-48d4-842a-c3060748b30e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.738920 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww7g5\" (UniqueName: \"kubernetes.io/projected/a3687254-2082-48d4-842a-c3060748b30e-kube-api-access-ww7g5\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.738968 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.738982 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3687254-2082-48d4-842a-c3060748b30e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.876807 5118 generic.go:334] "Generic (PLEG): container finished" podID="a3687254-2082-48d4-842a-c3060748b30e" containerID="31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f" exitCode=0 Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.876859 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerDied","Data":"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f"} Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.876899 5118 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-7vtpf" event={"ID":"a3687254-2082-48d4-842a-c3060748b30e","Type":"ContainerDied","Data":"0bba59c6029685357dce367edcc9e291915e3330af6b5992d157cbcc5484827b"} Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.876897 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vtpf" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.876918 5118 scope.go:117] "RemoveContainer" containerID="31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.911973 5118 scope.go:117] "RemoveContainer" containerID="bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f" Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.928752 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"] Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.935062 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vtpf"] Feb 23 08:08:30 crc kubenswrapper[5118]: I0223 08:08:30.954241 5118 scope.go:117] "RemoveContainer" containerID="189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.004549 5118 scope.go:117] "RemoveContainer" containerID="31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f" Feb 23 08:08:31 crc kubenswrapper[5118]: E0223 08:08:31.005237 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f\": container with ID starting with 31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f not found: ID does not exist" containerID="31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.005291 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f"} err="failed to get container status \"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f\": rpc error: code = NotFound desc = could not find container \"31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f\": container with ID starting with 31c0465b389dad03a08eac3f86a27d21eb261b466ad3c21c327d618a8a65c83f not found: ID does not exist" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.005325 5118 scope.go:117] "RemoveContainer" containerID="bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f" Feb 23 08:08:31 crc kubenswrapper[5118]: E0223 08:08:31.005657 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f\": container with ID starting with bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f not found: ID does not exist" containerID="bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.005688 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f"} err="failed to get container status \"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f\": rpc error: code = NotFound desc = could not find container \"bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f\": container with ID starting with bbc15367f489a06fc884529855370fdafad819c3240d67aaca0d52608688225f not found: ID does not exist" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.005713 5118 scope.go:117] "RemoveContainer" containerID="189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f" Feb 23 08:08:31 crc kubenswrapper[5118]: E0223 
08:08:31.006079 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f\": container with ID starting with 189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f not found: ID does not exist" containerID="189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.006172 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f"} err="failed to get container status \"189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f\": rpc error: code = NotFound desc = could not find container \"189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f\": container with ID starting with 189ca0f60842b438faff8f2ba4b6990a07608bfa19ab12720c258ca61bbb330f not found: ID does not exist" Feb 23 08:08:31 crc kubenswrapper[5118]: I0223 08:08:31.708181 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3687254-2082-48d4-842a-c3060748b30e" path="/var/lib/kubelet/pods/a3687254-2082-48d4-842a-c3060748b30e/volumes" Feb 23 08:10:02 crc kubenswrapper[5118]: I0223 08:10:02.975243 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:10:02 crc kubenswrapper[5118]: I0223 08:10:02.975749 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 08:10:32 crc kubenswrapper[5118]: I0223 08:10:32.975470 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:10:32 crc kubenswrapper[5118]: I0223 08:10:32.976781 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:11:02 crc kubenswrapper[5118]: I0223 08:11:02.976022 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:11:02 crc kubenswrapper[5118]: I0223 08:11:02.976985 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:11:02 crc kubenswrapper[5118]: I0223 08:11:02.977075 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:11:02 crc kubenswrapper[5118]: I0223 08:11:02.978069 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16"} 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:11:02 crc kubenswrapper[5118]: I0223 08:11:02.978196 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16" gracePeriod=600 Feb 23 08:11:03 crc kubenswrapper[5118]: I0223 08:11:03.338207 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16" exitCode=0 Feb 23 08:11:03 crc kubenswrapper[5118]: I0223 08:11:03.338299 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16"} Feb 23 08:11:03 crc kubenswrapper[5118]: I0223 08:11:03.338380 5118 scope.go:117] "RemoveContainer" containerID="0eda65da1d6711026b43fa052e588501fd69f213a6842fad5350c709aff28d63" Feb 23 08:11:04 crc kubenswrapper[5118]: I0223 08:11:04.352654 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"} Feb 23 08:13:32 crc kubenswrapper[5118]: I0223 08:13:32.975560 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 23 08:13:32 crc kubenswrapper[5118]: I0223 08:13:32.976680 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.629814 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630770 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="extract-utilities" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.630783 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="extract-utilities" Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630802 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="extract-utilities" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.630809 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="extract-utilities" Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630829 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="registry-server" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.630836 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="registry-server" Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630849 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="registry-server" Feb 23 08:13:47 crc 
kubenswrapper[5118]: I0223 08:13:47.630855 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="registry-server" Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630867 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="extract-content" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.630873 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="extract-content" Feb 23 08:13:47 crc kubenswrapper[5118]: E0223 08:13:47.630883 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="extract-content" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.630889 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="extract-content" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.631034 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3687254-2082-48d4-842a-c3060748b30e" containerName="registry-server" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.631057 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c7e4d1-0fbf-4984-864e-e795724aeed5" containerName="registry-server" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.632074 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.644208 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.671058 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.671137 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7gl\" (UniqueName: \"kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.671421 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.773575 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.773678 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.773730 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7gl\" (UniqueName: \"kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.775123 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.776385 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.813192 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7gl\" (UniqueName: \"kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl\") pod \"redhat-operators-bzfnt\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:47 crc kubenswrapper[5118]: I0223 08:13:47.952071 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.476285 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.631366 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.635900 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.650044 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.692970 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrgf\" (UniqueName: \"kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.695191 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.695357 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " 
pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.796674 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrgf\" (UniqueName: \"kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.797190 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.797299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.797982 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.798042 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " 
pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.828683 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrgf\" (UniqueName: \"kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf\") pod \"certified-operators-7lv6m\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:48 crc kubenswrapper[5118]: I0223 08:13:48.958072 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:49 crc kubenswrapper[5118]: I0223 08:13:49.076456 5118 generic.go:334] "Generic (PLEG): container finished" podID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerID="1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30" exitCode=0 Feb 23 08:13:49 crc kubenswrapper[5118]: I0223 08:13:49.076700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerDied","Data":"1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30"} Feb 23 08:13:49 crc kubenswrapper[5118]: I0223 08:13:49.076951 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerStarted","Data":"d326988e7e99cfa4cbc63c3807f8c7749a41deb77a729f8968fd50b37c489394"} Feb 23 08:13:49 crc kubenswrapper[5118]: I0223 08:13:49.079499 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:13:49 crc kubenswrapper[5118]: I0223 08:13:49.273156 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:13:49 crc kubenswrapper[5118]: W0223 08:13:49.282071 5118 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8f921e_a89c_4aa9_a45c_ab3eef3272c0.slice/crio-e9c32ea04f7a88dc20f3b8fb00637f4bf2cb61b0d536524802603b2a35978506 WatchSource:0}: Error finding container e9c32ea04f7a88dc20f3b8fb00637f4bf2cb61b0d536524802603b2a35978506: Status 404 returned error can't find the container with id e9c32ea04f7a88dc20f3b8fb00637f4bf2cb61b0d536524802603b2a35978506 Feb 23 08:13:50 crc kubenswrapper[5118]: I0223 08:13:50.085933 5118 generic.go:334] "Generic (PLEG): container finished" podID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerID="9a3956cd72f1a33d89a7a318d39a82a2d17a835cd7d7d82f8974a3a80f71f086" exitCode=0 Feb 23 08:13:50 crc kubenswrapper[5118]: I0223 08:13:50.086075 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerDied","Data":"9a3956cd72f1a33d89a7a318d39a82a2d17a835cd7d7d82f8974a3a80f71f086"} Feb 23 08:13:50 crc kubenswrapper[5118]: I0223 08:13:50.086569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerStarted","Data":"e9c32ea04f7a88dc20f3b8fb00637f4bf2cb61b0d536524802603b2a35978506"} Feb 23 08:13:50 crc kubenswrapper[5118]: I0223 08:13:50.088655 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerStarted","Data":"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552"} Feb 23 08:13:51 crc kubenswrapper[5118]: I0223 08:13:51.099009 5118 generic.go:334] "Generic (PLEG): container finished" podID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerID="3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552" exitCode=0 Feb 23 08:13:51 crc kubenswrapper[5118]: I0223 08:13:51.099128 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerDied","Data":"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552"} Feb 23 08:13:51 crc kubenswrapper[5118]: I0223 08:13:51.106526 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerStarted","Data":"73ca314ba3915792b1747918ca900c4bdcc213f7828f4546aabba6df188010bc"} Feb 23 08:13:52 crc kubenswrapper[5118]: I0223 08:13:52.142933 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerStarted","Data":"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7"} Feb 23 08:13:52 crc kubenswrapper[5118]: I0223 08:13:52.148981 5118 generic.go:334] "Generic (PLEG): container finished" podID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerID="73ca314ba3915792b1747918ca900c4bdcc213f7828f4546aabba6df188010bc" exitCode=0 Feb 23 08:13:52 crc kubenswrapper[5118]: I0223 08:13:52.149085 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerDied","Data":"73ca314ba3915792b1747918ca900c4bdcc213f7828f4546aabba6df188010bc"} Feb 23 08:13:52 crc kubenswrapper[5118]: I0223 08:13:52.170701 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzfnt" podStartSLOduration=2.7177226599999997 podStartE2EDuration="5.170672848s" podCreationTimestamp="2026-02-23 08:13:47 +0000 UTC" firstStartedPulling="2026-02-23 08:13:49.079122086 +0000 UTC m=+5292.082906659" lastFinishedPulling="2026-02-23 08:13:51.532072234 +0000 UTC m=+5294.535856847" observedRunningTime="2026-02-23 08:13:52.167272545 +0000 UTC m=+5295.171057138" 
watchObservedRunningTime="2026-02-23 08:13:52.170672848 +0000 UTC m=+5295.174457461" Feb 23 08:13:53 crc kubenswrapper[5118]: I0223 08:13:53.160384 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerStarted","Data":"a7970f00e5bcd5a042696e6625c7ca8cdc0000d8aea6accbd39d6664dbd2665b"} Feb 23 08:13:53 crc kubenswrapper[5118]: I0223 08:13:53.193215 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7lv6m" podStartSLOduration=2.759323594 podStartE2EDuration="5.193190371s" podCreationTimestamp="2026-02-23 08:13:48 +0000 UTC" firstStartedPulling="2026-02-23 08:13:50.087565143 +0000 UTC m=+5293.091349716" lastFinishedPulling="2026-02-23 08:13:52.52143192 +0000 UTC m=+5295.525216493" observedRunningTime="2026-02-23 08:13:53.187555936 +0000 UTC m=+5296.191340529" watchObservedRunningTime="2026-02-23 08:13:53.193190371 +0000 UTC m=+5296.196974944" Feb 23 08:13:57 crc kubenswrapper[5118]: I0223 08:13:57.952803 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:57 crc kubenswrapper[5118]: I0223 08:13:57.953213 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:13:58 crc kubenswrapper[5118]: I0223 08:13:58.958686 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:58 crc kubenswrapper[5118]: I0223 08:13:58.958783 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:59 crc kubenswrapper[5118]: I0223 08:13:59.010200 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzfnt" 
podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="registry-server" probeResult="failure" output=< Feb 23 08:13:59 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 08:13:59 crc kubenswrapper[5118]: > Feb 23 08:13:59 crc kubenswrapper[5118]: I0223 08:13:59.019478 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:59 crc kubenswrapper[5118]: I0223 08:13:59.279053 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:13:59 crc kubenswrapper[5118]: I0223 08:13:59.333727 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:14:01 crc kubenswrapper[5118]: I0223 08:14:01.229854 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7lv6m" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="registry-server" containerID="cri-o://a7970f00e5bcd5a042696e6625c7ca8cdc0000d8aea6accbd39d6664dbd2665b" gracePeriod=2 Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.241493 5118 generic.go:334] "Generic (PLEG): container finished" podID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerID="a7970f00e5bcd5a042696e6625c7ca8cdc0000d8aea6accbd39d6664dbd2665b" exitCode=0 Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.241569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerDied","Data":"a7970f00e5bcd5a042696e6625c7ca8cdc0000d8aea6accbd39d6664dbd2665b"} Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.872891 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.957351 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities\") pod \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.957436 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content\") pod \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.957506 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwrgf\" (UniqueName: \"kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf\") pod \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\" (UID: \"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0\") " Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.958675 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities" (OuterVolumeSpecName: "utilities") pod "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" (UID: "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.963670 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf" (OuterVolumeSpecName: "kube-api-access-rwrgf") pod "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" (UID: "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0"). InnerVolumeSpecName "kube-api-access-rwrgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.975662 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:14:02 crc kubenswrapper[5118]: I0223 08:14:02.975744 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.015655 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" (UID: "ec8f921e-a89c-4aa9-a45c-ab3eef3272c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.059652 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.059714 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwrgf\" (UniqueName: \"kubernetes.io/projected/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-kube-api-access-rwrgf\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.059736 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.257883 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lv6m" event={"ID":"ec8f921e-a89c-4aa9-a45c-ab3eef3272c0","Type":"ContainerDied","Data":"e9c32ea04f7a88dc20f3b8fb00637f4bf2cb61b0d536524802603b2a35978506"} Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.257970 5118 scope.go:117] "RemoveContainer" containerID="a7970f00e5bcd5a042696e6625c7ca8cdc0000d8aea6accbd39d6664dbd2665b" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.258030 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lv6m" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.299323 5118 scope.go:117] "RemoveContainer" containerID="73ca314ba3915792b1747918ca900c4bdcc213f7828f4546aabba6df188010bc" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.316385 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.327440 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7lv6m"] Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.350046 5118 scope.go:117] "RemoveContainer" containerID="9a3956cd72f1a33d89a7a318d39a82a2d17a835cd7d7d82f8974a3a80f71f086" Feb 23 08:14:03 crc kubenswrapper[5118]: I0223 08:14:03.715574 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" path="/var/lib/kubelet/pods/ec8f921e-a89c-4aa9-a45c-ab3eef3272c0/volumes" Feb 23 08:14:08 crc kubenswrapper[5118]: I0223 08:14:08.023736 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:14:08 crc kubenswrapper[5118]: I0223 08:14:08.094090 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:14:08 crc kubenswrapper[5118]: I0223 08:14:08.267661 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.326805 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzfnt" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="registry-server" containerID="cri-o://ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7" gracePeriod=2 Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 
08:14:09.809627 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.868920 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content\") pod \"6bc9c312-233e-4c27-b000-cbbe34b303d5\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.869031 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7gl\" (UniqueName: \"kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl\") pod \"6bc9c312-233e-4c27-b000-cbbe34b303d5\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.869165 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities\") pod \"6bc9c312-233e-4c27-b000-cbbe34b303d5\" (UID: \"6bc9c312-233e-4c27-b000-cbbe34b303d5\") " Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.870270 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities" (OuterVolumeSpecName: "utilities") pod "6bc9c312-233e-4c27-b000-cbbe34b303d5" (UID: "6bc9c312-233e-4c27-b000-cbbe34b303d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.877743 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl" (OuterVolumeSpecName: "kube-api-access-jn7gl") pod "6bc9c312-233e-4c27-b000-cbbe34b303d5" (UID: "6bc9c312-233e-4c27-b000-cbbe34b303d5"). InnerVolumeSpecName "kube-api-access-jn7gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.971370 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:09 crc kubenswrapper[5118]: I0223 08:14:09.971403 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7gl\" (UniqueName: \"kubernetes.io/projected/6bc9c312-233e-4c27-b000-cbbe34b303d5-kube-api-access-jn7gl\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.009408 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bc9c312-233e-4c27-b000-cbbe34b303d5" (UID: "6bc9c312-233e-4c27-b000-cbbe34b303d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.072899 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc9c312-233e-4c27-b000-cbbe34b303d5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.341159 5118 generic.go:334] "Generic (PLEG): container finished" podID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerID="ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7" exitCode=0 Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.341279 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzfnt" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.342356 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerDied","Data":"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7"} Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.342527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzfnt" event={"ID":"6bc9c312-233e-4c27-b000-cbbe34b303d5","Type":"ContainerDied","Data":"d326988e7e99cfa4cbc63c3807f8c7749a41deb77a729f8968fd50b37c489394"} Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.342581 5118 scope.go:117] "RemoveContainer" containerID="ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.372174 5118 scope.go:117] "RemoveContainer" containerID="3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.398694 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 
08:14:10.409805 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzfnt"] Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.419281 5118 scope.go:117] "RemoveContainer" containerID="1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.452477 5118 scope.go:117] "RemoveContainer" containerID="ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7" Feb 23 08:14:10 crc kubenswrapper[5118]: E0223 08:14:10.453141 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7\": container with ID starting with ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7 not found: ID does not exist" containerID="ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.453193 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7"} err="failed to get container status \"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7\": rpc error: code = NotFound desc = could not find container \"ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7\": container with ID starting with ec62c19f577134f362a7e778a24333d2b31e86e30b81e40a4a96744c33e182d7 not found: ID does not exist" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.453223 5118 scope.go:117] "RemoveContainer" containerID="3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552" Feb 23 08:14:10 crc kubenswrapper[5118]: E0223 08:14:10.453845 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552\": container with ID 
starting with 3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552 not found: ID does not exist" containerID="3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.454017 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552"} err="failed to get container status \"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552\": rpc error: code = NotFound desc = could not find container \"3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552\": container with ID starting with 3cd28cb0a6234b23516d61a533aba7777e64baf66e39b897003cfc45a6232552 not found: ID does not exist" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.454192 5118 scope.go:117] "RemoveContainer" containerID="1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30" Feb 23 08:14:10 crc kubenswrapper[5118]: E0223 08:14:10.454986 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30\": container with ID starting with 1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30 not found: ID does not exist" containerID="1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30" Feb 23 08:14:10 crc kubenswrapper[5118]: I0223 08:14:10.455078 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30"} err="failed to get container status \"1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30\": rpc error: code = NotFound desc = could not find container \"1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30\": container with ID starting with 1baab0e0c710b838d50d743985b1f6c443b6166ff2dba3e7245a5c483cf51a30 not found: 
ID does not exist" Feb 23 08:14:11 crc kubenswrapper[5118]: I0223 08:14:11.724615 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" path="/var/lib/kubelet/pods/6bc9c312-233e-4c27-b000-cbbe34b303d5/volumes" Feb 23 08:14:32 crc kubenswrapper[5118]: I0223 08:14:32.975056 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:14:32 crc kubenswrapper[5118]: I0223 08:14:32.975882 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:14:32 crc kubenswrapper[5118]: I0223 08:14:32.975953 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:14:32 crc kubenswrapper[5118]: I0223 08:14:32.976589 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:14:32 crc kubenswrapper[5118]: I0223 08:14:32.976662 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" 
containerID="cri-o://0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" gracePeriod=600 Feb 23 08:14:33 crc kubenswrapper[5118]: E0223 08:14:33.103833 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:14:33 crc kubenswrapper[5118]: I0223 08:14:33.571404 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" exitCode=0 Feb 23 08:14:33 crc kubenswrapper[5118]: I0223 08:14:33.571471 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"} Feb 23 08:14:33 crc kubenswrapper[5118]: I0223 08:14:33.571525 5118 scope.go:117] "RemoveContainer" containerID="06c13a9258503f19184e9cb3f622c606a2eb03b98d69f5116a81e486b888cb16" Feb 23 08:14:33 crc kubenswrapper[5118]: I0223 08:14:33.572438 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:14:33 crc kubenswrapper[5118]: E0223 08:14:33.573221 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:14:45 crc kubenswrapper[5118]: I0223 08:14:45.697721 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:14:45 crc kubenswrapper[5118]: E0223 08:14:45.698842 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:14:59 crc kubenswrapper[5118]: I0223 08:14:59.749155 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:14:59 crc kubenswrapper[5118]: E0223 08:14:59.751053 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.160650 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g"] Feb 23 08:15:00 crc kubenswrapper[5118]: E0223 08:15:00.161522 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161558 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="extract-content" Feb 23 08:15:00 crc 
kubenswrapper[5118]: E0223 08:15:00.161582 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161595 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5118]: E0223 08:15:00.161622 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161635 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="extract-utilities" Feb 23 08:15:00 crc kubenswrapper[5118]: E0223 08:15:00.161661 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161674 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="extract-content" Feb 23 08:15:00 crc kubenswrapper[5118]: E0223 08:15:00.161704 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161714 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5118]: E0223 08:15:00.161729 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.161738 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="registry-server" Feb 23 08:15:00 crc 
kubenswrapper[5118]: I0223 08:15:00.161993 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc9c312-233e-4c27-b000-cbbe34b303d5" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.162038 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8f921e-a89c-4aa9-a45c-ab3eef3272c0" containerName="registry-server" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.162961 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.166255 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.169935 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.173989 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g"] Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.359312 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.359431 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.359484 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59lvj\" (UniqueName: \"kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.461004 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59lvj\" (UniqueName: \"kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.461340 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.461398 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.463282 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.470882 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.485213 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59lvj\" (UniqueName: \"kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj\") pod \"collect-profiles-29530575-6tm2g\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:00 crc kubenswrapper[5118]: I0223 08:15:00.489626 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:01 crc kubenswrapper[5118]: I0223 08:15:01.515956 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g"] Feb 23 08:15:01 crc kubenswrapper[5118]: I0223 08:15:01.723705 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" event={"ID":"d79b299b-f10b-4056-aa4e-78a9a3368d64","Type":"ContainerStarted","Data":"613538892e59998b65529532169dcd1a057ce670163a6fe806a3db8934f299fe"} Feb 23 08:15:01 crc kubenswrapper[5118]: I0223 08:15:01.724497 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" event={"ID":"d79b299b-f10b-4056-aa4e-78a9a3368d64","Type":"ContainerStarted","Data":"63ad35239a5b1e2587b9b9f755f9af73312edd967abc081ded680b2c5b70e5ca"} Feb 23 08:15:01 crc kubenswrapper[5118]: I0223 08:15:01.748745 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" podStartSLOduration=1.748674056 podStartE2EDuration="1.748674056s" podCreationTimestamp="2026-02-23 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:15:01.746980715 +0000 UTC m=+5364.750765328" watchObservedRunningTime="2026-02-23 08:15:01.748674056 +0000 UTC m=+5364.752458669" Feb 23 08:15:02 crc kubenswrapper[5118]: I0223 08:15:02.738193 5118 generic.go:334] "Generic (PLEG): container finished" podID="d79b299b-f10b-4056-aa4e-78a9a3368d64" containerID="613538892e59998b65529532169dcd1a057ce670163a6fe806a3db8934f299fe" exitCode=0 Feb 23 08:15:02 crc kubenswrapper[5118]: I0223 08:15:02.738298 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" event={"ID":"d79b299b-f10b-4056-aa4e-78a9a3368d64","Type":"ContainerDied","Data":"613538892e59998b65529532169dcd1a057ce670163a6fe806a3db8934f299fe"} Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.073211 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.139697 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59lvj\" (UniqueName: \"kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj\") pod \"d79b299b-f10b-4056-aa4e-78a9a3368d64\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.139888 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume\") pod \"d79b299b-f10b-4056-aa4e-78a9a3368d64\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.140009 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume\") pod \"d79b299b-f10b-4056-aa4e-78a9a3368d64\" (UID: \"d79b299b-f10b-4056-aa4e-78a9a3368d64\") " Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.140771 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume" (OuterVolumeSpecName: "config-volume") pod "d79b299b-f10b-4056-aa4e-78a9a3368d64" (UID: "d79b299b-f10b-4056-aa4e-78a9a3368d64"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.147821 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d79b299b-f10b-4056-aa4e-78a9a3368d64" (UID: "d79b299b-f10b-4056-aa4e-78a9a3368d64"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.150477 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj" (OuterVolumeSpecName: "kube-api-access-59lvj") pod "d79b299b-f10b-4056-aa4e-78a9a3368d64" (UID: "d79b299b-f10b-4056-aa4e-78a9a3368d64"). InnerVolumeSpecName "kube-api-access-59lvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.242441 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d79b299b-f10b-4056-aa4e-78a9a3368d64-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.242498 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59lvj\" (UniqueName: \"kubernetes.io/projected/d79b299b-f10b-4056-aa4e-78a9a3368d64-kube-api-access-59lvj\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.242512 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d79b299b-f10b-4056-aa4e-78a9a3368d64-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.616736 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn"] Feb 23 08:15:04 crc kubenswrapper[5118]: 
I0223 08:15:04.623959 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530530-8zhjn"] Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.759577 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" event={"ID":"d79b299b-f10b-4056-aa4e-78a9a3368d64","Type":"ContainerDied","Data":"63ad35239a5b1e2587b9b9f755f9af73312edd967abc081ded680b2c5b70e5ca"} Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.759635 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63ad35239a5b1e2587b9b9f755f9af73312edd967abc081ded680b2c5b70e5ca" Feb 23 08:15:04 crc kubenswrapper[5118]: I0223 08:15:04.759708 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g" Feb 23 08:15:05 crc kubenswrapper[5118]: I0223 08:15:05.709498 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39bb2b60-4d35-423f-ac18-c48d017800cd" path="/var/lib/kubelet/pods/39bb2b60-4d35-423f-ac18-c48d017800cd/volumes" Feb 23 08:15:13 crc kubenswrapper[5118]: I0223 08:15:13.698218 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:15:13 crc kubenswrapper[5118]: E0223 08:15:13.699185 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:15:25 crc kubenswrapper[5118]: I0223 08:15:25.698239 5118 scope.go:117] "RemoveContainer" 
containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:15:25 crc kubenswrapper[5118]: E0223 08:15:25.699315 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:15:36 crc kubenswrapper[5118]: I0223 08:15:36.698846 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:15:36 crc kubenswrapper[5118]: E0223 08:15:36.699680 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:15:48 crc kubenswrapper[5118]: I0223 08:15:48.697831 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:15:48 crc kubenswrapper[5118]: E0223 08:15:48.698703 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:15:50 crc kubenswrapper[5118]: I0223 08:15:50.812055 5118 scope.go:117] 
"RemoveContainer" containerID="123da3c0cc5c405777b801da8c3912f59cb81951ccbf7ca79f59b658d01a7388" Feb 23 08:16:03 crc kubenswrapper[5118]: I0223 08:16:03.698521 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:16:03 crc kubenswrapper[5118]: E0223 08:16:03.700084 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:16:17 crc kubenswrapper[5118]: I0223 08:16:17.702790 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:16:17 crc kubenswrapper[5118]: E0223 08:16:17.704192 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:16:28 crc kubenswrapper[5118]: I0223 08:16:28.697132 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:16:28 crc kubenswrapper[5118]: E0223 08:16:28.698123 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:16:43 crc kubenswrapper[5118]: I0223 08:16:43.698264 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:16:43 crc kubenswrapper[5118]: E0223 08:16:43.699409 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:16:57 crc kubenswrapper[5118]: I0223 08:16:57.707723 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:16:57 crc kubenswrapper[5118]: E0223 08:16:57.709169 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:17:12 crc kubenswrapper[5118]: I0223 08:17:12.697374 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:17:12 crc kubenswrapper[5118]: E0223 08:17:12.698608 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:17:25 crc kubenswrapper[5118]: I0223 08:17:25.698116 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:17:25 crc kubenswrapper[5118]: E0223 08:17:25.699253 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:17:40 crc kubenswrapper[5118]: I0223 08:17:40.698652 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:17:40 crc kubenswrapper[5118]: E0223 08:17:40.699647 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:17:51 crc kubenswrapper[5118]: I0223 08:17:51.697986 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:17:51 crc kubenswrapper[5118]: E0223 08:17:51.698898 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:18:05 crc kubenswrapper[5118]: I0223 08:18:05.697621 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:18:05 crc kubenswrapper[5118]: E0223 08:18:05.698770 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:18:18 crc kubenswrapper[5118]: I0223 08:18:18.702525 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:18:18 crc kubenswrapper[5118]: E0223 08:18:18.703720 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:18:33 crc kubenswrapper[5118]: I0223 08:18:33.747319 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:18:33 crc kubenswrapper[5118]: E0223 08:18:33.750254 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.156135 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"] Feb 23 08:18:37 crc kubenswrapper[5118]: E0223 08:18:37.157013 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79b299b-f10b-4056-aa4e-78a9a3368d64" containerName="collect-profiles" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.157029 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79b299b-f10b-4056-aa4e-78a9a3368d64" containerName="collect-profiles" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.157216 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79b299b-f10b-4056-aa4e-78a9a3368d64" containerName="collect-profiles" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.158278 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.193162 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"] Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.261656 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bdn\" (UniqueName: \"kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.261738 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.261954 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.372388 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bdn\" (UniqueName: \"kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.372501 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.372605 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.373530 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.375208 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.416980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bdn\" (UniqueName: \"kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn\") pod \"redhat-marketplace-szc9v\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") " pod="openshift-marketplace/redhat-marketplace-szc9v" Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.480121 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.832548 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"]
Feb 23 08:18:37 crc kubenswrapper[5118]: I0223 08:18:37.876441 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerStarted","Data":"6f33a2681da2d5133d83e28b8c2a328fe4a542573d6e8ad94bddce8b326591e3"}
Feb 23 08:18:38 crc kubenswrapper[5118]: I0223 08:18:38.889199 5118 generic.go:334] "Generic (PLEG): container finished" podID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerID="da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043" exitCode=0
Feb 23 08:18:38 crc kubenswrapper[5118]: I0223 08:18:38.889268 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerDied","Data":"da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043"}
Feb 23 08:18:39 crc kubenswrapper[5118]: I0223 08:18:39.899551 5118 generic.go:334] "Generic (PLEG): container finished" podID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerID="995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084" exitCode=0
Feb 23 08:18:39 crc kubenswrapper[5118]: I0223 08:18:39.899642 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerDied","Data":"995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084"}
Feb 23 08:18:40 crc kubenswrapper[5118]: I0223 08:18:40.908599 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerStarted","Data":"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"}
Feb 23 08:18:40 crc kubenswrapper[5118]: I0223 08:18:40.941553 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szc9v" podStartSLOduration=2.555954272 podStartE2EDuration="3.941003603s" podCreationTimestamp="2026-02-23 08:18:37 +0000 UTC" firstStartedPulling="2026-02-23 08:18:38.893780001 +0000 UTC m=+5581.897564604" lastFinishedPulling="2026-02-23 08:18:40.278829362 +0000 UTC m=+5583.282613935" observedRunningTime="2026-02-23 08:18:40.930368567 +0000 UTC m=+5583.934153170" watchObservedRunningTime="2026-02-23 08:18:40.941003603 +0000 UTC m=+5583.944788176"
Feb 23 08:18:44 crc kubenswrapper[5118]: I0223 08:18:44.697484 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"
Feb 23 08:18:44 crc kubenswrapper[5118]: E0223 08:18:44.698416 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:18:47 crc kubenswrapper[5118]: I0223 08:18:47.480547 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:47 crc kubenswrapper[5118]: I0223 08:18:47.480944 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:47 crc kubenswrapper[5118]: I0223 08:18:47.564173 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:48 crc kubenswrapper[5118]: I0223 08:18:48.048847 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:48 crc kubenswrapper[5118]: I0223 08:18:48.125898 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"]
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.005932 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szc9v" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="registry-server" containerID="cri-o://8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95" gracePeriod=2
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.524068 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.605456 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2bdn\" (UniqueName: \"kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn\") pod \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") "
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.605565 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities\") pod \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") "
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.605682 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content\") pod \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\" (UID: \"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0\") "
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.606766 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities" (OuterVolumeSpecName: "utilities") pod "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" (UID: "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.611270 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn" (OuterVolumeSpecName: "kube-api-access-j2bdn") pod "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" (UID: "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0"). InnerVolumeSpecName "kube-api-access-j2bdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.637609 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" (UID: "5faa2ed1-24c6-46c6-872e-4e98a02e3bc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.707476 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.707536 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2bdn\" (UniqueName: \"kubernetes.io/projected/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-kube-api-access-j2bdn\") on node \"crc\" DevicePath \"\""
Feb 23 08:18:50 crc kubenswrapper[5118]: I0223 08:18:50.707550 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.017338 5118 generic.go:334] "Generic (PLEG): container finished" podID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerID="8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95" exitCode=0
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.017404 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerDied","Data":"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"}
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.017444 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szc9v" event={"ID":"5faa2ed1-24c6-46c6-872e-4e98a02e3bc0","Type":"ContainerDied","Data":"6f33a2681da2d5133d83e28b8c2a328fe4a542573d6e8ad94bddce8b326591e3"}
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.017475 5118 scope.go:117] "RemoveContainer" containerID="8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.017655 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szc9v"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.044347 5118 scope.go:117] "RemoveContainer" containerID="995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.079992 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"]
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.086310 5118 scope.go:117] "RemoveContainer" containerID="da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.091928 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szc9v"]
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.115301 5118 scope.go:117] "RemoveContainer" containerID="8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"
Feb 23 08:18:51 crc kubenswrapper[5118]: E0223 08:18:51.119195 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95\": container with ID starting with 8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95 not found: ID does not exist" containerID="8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.119232 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95"} err="failed to get container status \"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95\": rpc error: code = NotFound desc = could not find container \"8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95\": container with ID starting with 8d6cea21c75eb21f0e6f84d5b3b7f0c1ce91a8fc05d344363ff5db7383f35b95 not found: ID does not exist"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.119258 5118 scope.go:117] "RemoveContainer" containerID="995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084"
Feb 23 08:18:51 crc kubenswrapper[5118]: E0223 08:18:51.124168 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084\": container with ID starting with 995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084 not found: ID does not exist" containerID="995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.124191 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084"} err="failed to get container status \"995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084\": rpc error: code = NotFound desc = could not find container \"995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084\": container with ID starting with 995835361aa6ca053c220c3fb5eae190d3caf90599a5c6b9d0fb51083927d084 not found: ID does not exist"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.124206 5118 scope.go:117] "RemoveContainer" containerID="da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043"
Feb 23 08:18:51 crc kubenswrapper[5118]: E0223 08:18:51.125324 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043\": container with ID starting with da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043 not found: ID does not exist" containerID="da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.125347 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043"} err="failed to get container status \"da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043\": rpc error: code = NotFound desc = could not find container \"da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043\": container with ID starting with da3d1f0cfca3655e88edbc7c2944e3f4b26ea2d33edbdce9c50aee00fd2f9043 not found: ID does not exist"
Feb 23 08:18:51 crc kubenswrapper[5118]: I0223 08:18:51.714356 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" path="/var/lib/kubelet/pods/5faa2ed1-24c6-46c6-872e-4e98a02e3bc0/volumes"
Feb 23 08:18:57 crc kubenswrapper[5118]: I0223 08:18:57.704219 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"
Feb 23 08:18:57 crc kubenswrapper[5118]: E0223 08:18:57.704810 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:19:09 crc kubenswrapper[5118]: I0223 08:19:09.698290 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"
Feb 23 08:19:09 crc kubenswrapper[5118]: E0223 08:19:09.699384 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:19:22 crc kubenswrapper[5118]: I0223 08:19:22.699503 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"
Feb 23 08:19:22 crc kubenswrapper[5118]: E0223 08:19:22.700600 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:19:37 crc kubenswrapper[5118]: I0223 08:19:37.702777 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4"
Feb 23 08:19:38 crc kubenswrapper[5118]: I0223 08:19:38.493377 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582"}
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.001941 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-cgwkc"]
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.012502 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-cgwkc"]
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.137313 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-nlb5r"]
Feb 23 08:21:36 crc kubenswrapper[5118]: E0223 08:21:36.137858 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="extract-content"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.137889 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="extract-content"
Feb 23 08:21:36 crc kubenswrapper[5118]: E0223 08:21:36.137937 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="extract-utilities"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.137952 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="extract-utilities"
Feb 23 08:21:36 crc kubenswrapper[5118]: E0223 08:21:36.137975 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="registry-server"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.137990 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="registry-server"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.138279 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5faa2ed1-24c6-46c6-872e-4e98a02e3bc0" containerName="registry-server"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.139134 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.143191 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nlb5r"]
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.146756 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.150009 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.150183 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.150697 5118 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hsj24"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.227121 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.227212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.227289 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.358500 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.358602 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.358646 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.359795 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.359850 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.392480 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf\") pod \"crc-storage-crc-nlb5r\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") " pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:36 crc kubenswrapper[5118]: I0223 08:21:36.473170 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:37 crc kubenswrapper[5118]: I0223 08:21:37.045382 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nlb5r"]
Feb 23 08:21:37 crc kubenswrapper[5118]: I0223 08:21:37.046207 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:21:37 crc kubenswrapper[5118]: I0223 08:21:37.651509 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nlb5r" event={"ID":"bccc1beb-cac2-457f-8a73-09a6c0d28809","Type":"ContainerStarted","Data":"7926126ea8d4bd4311ad524b081071878f5099b2c22ac4b5e4101c51d29bafd4"}
Feb 23 08:21:37 crc kubenswrapper[5118]: I0223 08:21:37.733185 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b146f5c-57ff-45e2-862d-41329e4fd358" path="/var/lib/kubelet/pods/5b146f5c-57ff-45e2-862d-41329e4fd358/volumes"
Feb 23 08:21:38 crc kubenswrapper[5118]: I0223 08:21:38.662549 5118 generic.go:334] "Generic (PLEG): container finished" podID="bccc1beb-cac2-457f-8a73-09a6c0d28809" containerID="492bcd6dba0ae02876e4f6ecffa5881479b703c9c74c9a7ce67d18e8905a6428" exitCode=0
Feb 23 08:21:38 crc kubenswrapper[5118]: I0223 08:21:38.662626 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nlb5r" event={"ID":"bccc1beb-cac2-457f-8a73-09a6c0d28809","Type":"ContainerDied","Data":"492bcd6dba0ae02876e4f6ecffa5881479b703c9c74c9a7ce67d18e8905a6428"}
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.006887 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.127414 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt\") pod \"bccc1beb-cac2-457f-8a73-09a6c0d28809\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") "
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.127805 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage\") pod \"bccc1beb-cac2-457f-8a73-09a6c0d28809\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") "
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.127517 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "bccc1beb-cac2-457f-8a73-09a6c0d28809" (UID: "bccc1beb-cac2-457f-8a73-09a6c0d28809"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.128253 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf\") pod \"bccc1beb-cac2-457f-8a73-09a6c0d28809\" (UID: \"bccc1beb-cac2-457f-8a73-09a6c0d28809\") "
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.129057 5118 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/bccc1beb-cac2-457f-8a73-09a6c0d28809-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.136312 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf" (OuterVolumeSpecName: "kube-api-access-4m2wf") pod "bccc1beb-cac2-457f-8a73-09a6c0d28809" (UID: "bccc1beb-cac2-457f-8a73-09a6c0d28809"). InnerVolumeSpecName "kube-api-access-4m2wf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.152212 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "bccc1beb-cac2-457f-8a73-09a6c0d28809" (UID: "bccc1beb-cac2-457f-8a73-09a6c0d28809"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.230943 5118 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/bccc1beb-cac2-457f-8a73-09a6c0d28809-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.231009 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/bccc1beb-cac2-457f-8a73-09a6c0d28809-kube-api-access-4m2wf\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.684213 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nlb5r" event={"ID":"bccc1beb-cac2-457f-8a73-09a6c0d28809","Type":"ContainerDied","Data":"7926126ea8d4bd4311ad524b081071878f5099b2c22ac4b5e4101c51d29bafd4"}
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.684274 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7926126ea8d4bd4311ad524b081071878f5099b2c22ac4b5e4101c51d29bafd4"
Feb 23 08:21:40 crc kubenswrapper[5118]: I0223 08:21:40.684397 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nlb5r"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.181993 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-nlb5r"]
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.214747 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-nlb5r"]
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.288628 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-skjbq"]
Feb 23 08:21:42 crc kubenswrapper[5118]: E0223 08:21:42.288944 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccc1beb-cac2-457f-8a73-09a6c0d28809" containerName="storage"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.288958 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccc1beb-cac2-457f-8a73-09a6c0d28809" containerName="storage"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.289130 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccc1beb-cac2-457f-8a73-09a6c0d28809" containerName="storage"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.289641 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.301740 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.301800 5118 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hsj24"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.302008 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.302508 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.306787 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-skjbq"]
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.368233 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtwm\" (UniqueName: \"kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.368305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.368512 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.470552 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtwm\" (UniqueName: \"kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.470633 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.470721 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.470917 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.471918 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.500908 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtwm\" (UniqueName: \"kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm\") pod \"crc-storage-crc-skjbq\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") " pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.613482 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:42 crc kubenswrapper[5118]: I0223 08:21:42.925386 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-skjbq"]
Feb 23 08:21:43 crc kubenswrapper[5118]: I0223 08:21:43.718579 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccc1beb-cac2-457f-8a73-09a6c0d28809" path="/var/lib/kubelet/pods/bccc1beb-cac2-457f-8a73-09a6c0d28809/volumes"
Feb 23 08:21:43 crc kubenswrapper[5118]: I0223 08:21:43.721955 5118 generic.go:334] "Generic (PLEG): container finished" podID="c2450705-6a4c-4fab-b248-e24fcb74f0d5" containerID="af4f397afd06734b4154cbad07dcd40a2eda9fc4f72795155f30f5a96751949a" exitCode=0
Feb 23 08:21:43 crc kubenswrapper[5118]: I0223 08:21:43.722608 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-skjbq" event={"ID":"c2450705-6a4c-4fab-b248-e24fcb74f0d5","Type":"ContainerDied","Data":"af4f397afd06734b4154cbad07dcd40a2eda9fc4f72795155f30f5a96751949a"}
Feb 23 08:21:43 crc kubenswrapper[5118]: I0223 08:21:43.722687 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-skjbq" event={"ID":"c2450705-6a4c-4fab-b248-e24fcb74f0d5","Type":"ContainerStarted","Data":"a9d77decbb61b2a636031a4be7eac06c1d9f6e1acc2824c1629fe164922f3d70"}
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.155447 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-skjbq"
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.219068 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage\") pod \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") "
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.219634 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtwm\" (UniqueName: \"kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm\") pod \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") "
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.219942 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt\") pod \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\" (UID: \"c2450705-6a4c-4fab-b248-e24fcb74f0d5\") "
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.220040 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "c2450705-6a4c-4fab-b248-e24fcb74f0d5" (UID: "c2450705-6a4c-4fab-b248-e24fcb74f0d5"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.221133 5118 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/c2450705-6a4c-4fab-b248-e24fcb74f0d5-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.232148 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm" (OuterVolumeSpecName: "kube-api-access-zrtwm") pod "c2450705-6a4c-4fab-b248-e24fcb74f0d5" (UID: "c2450705-6a4c-4fab-b248-e24fcb74f0d5"). InnerVolumeSpecName "kube-api-access-zrtwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.252133 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "c2450705-6a4c-4fab-b248-e24fcb74f0d5" (UID: "c2450705-6a4c-4fab-b248-e24fcb74f0d5"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.322820 5118 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/c2450705-6a4c-4fab-b248-e24fcb74f0d5-crc-storage\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.323194 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtwm\" (UniqueName: \"kubernetes.io/projected/c2450705-6a4c-4fab-b248-e24fcb74f0d5-kube-api-access-zrtwm\") on node \"crc\" DevicePath \"\""
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.743741 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-skjbq" event={"ID":"c2450705-6a4c-4fab-b248-e24fcb74f0d5","Type":"ContainerDied","Data":"a9d77decbb61b2a636031a4be7eac06c1d9f6e1acc2824c1629fe164922f3d70"}
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.744270 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9d77decbb61b2a636031a4be7eac06c1d9f6e1acc2824c1629fe164922f3d70"
Feb 23 08:21:45 crc kubenswrapper[5118]: I0223 08:21:45.743847 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-skjbq" Feb 23 08:21:51 crc kubenswrapper[5118]: I0223 08:21:51.029872 5118 scope.go:117] "RemoveContainer" containerID="68148643f0bfaec0fd75a57daaf7377976aaddd6db878112a4bb51c63721ea41" Feb 23 08:22:02 crc kubenswrapper[5118]: I0223 08:22:02.975544 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:22:02 crc kubenswrapper[5118]: I0223 08:22:02.976351 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:22:32 crc kubenswrapper[5118]: I0223 08:22:32.975366 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:22:32 crc kubenswrapper[5118]: I0223 08:22:32.976438 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:23:02 crc kubenswrapper[5118]: I0223 08:23:02.975997 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:23:02 crc kubenswrapper[5118]: I0223 08:23:02.977011 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:23:02 crc kubenswrapper[5118]: I0223 08:23:02.977090 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:23:02 crc kubenswrapper[5118]: I0223 08:23:02.978053 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:23:02 crc kubenswrapper[5118]: I0223 08:23:02.978168 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582" gracePeriod=600 Feb 23 08:23:03 crc kubenswrapper[5118]: I0223 08:23:03.569880 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582" exitCode=0 Feb 23 08:23:03 crc kubenswrapper[5118]: I0223 08:23:03.569931 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582"} Feb 23 08:23:03 crc kubenswrapper[5118]: I0223 08:23:03.570250 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"} Feb 23 08:23:03 crc kubenswrapper[5118]: I0223 08:23:03.570275 5118 scope.go:117] "RemoveContainer" containerID="0537fc7f35612cd5e52c27e54ac08dceb0272dffda77b13833d09641cd2541f4" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.018748 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"] Feb 23 08:23:53 crc kubenswrapper[5118]: E0223 08:23:53.019829 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2450705-6a4c-4fab-b248-e24fcb74f0d5" containerName="storage" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.019852 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2450705-6a4c-4fab-b248-e24fcb74f0d5" containerName="storage" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.020130 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2450705-6a4c-4fab-b248-e24fcb74f0d5" containerName="storage" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.022418 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.032325 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"] Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.199483 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qlz9\" (UniqueName: \"kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.199527 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.199662 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.301487 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.301563 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.301587 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qlz9\" (UniqueName: \"kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.302468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.302694 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.326179 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qlz9\" (UniqueName: \"kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9\") pod \"certified-operators-8b4cq\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") " pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.354894 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:23:53 crc kubenswrapper[5118]: I0223 08:23:53.652549 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"] Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.012690 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rbs5g"] Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.014697 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.033524 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbs5g"] Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.039003 5118 generic.go:334] "Generic (PLEG): container finished" podID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerID="a0f6ac02b84be0e4b80b0d8c0932131a45e03a0838e293a4b0b876c1fe25deb0" exitCode=0 Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.039066 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerDied","Data":"a0f6ac02b84be0e4b80b0d8c0932131a45e03a0838e293a4b0b876c1fe25deb0"} Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.039127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerStarted","Data":"d2094a0a8018297893c958574b5228b0a751e4c184a403ff9f4816fbc1251ebe"} Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.114679 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities\") pod \"community-operators-rbs5g\" (UID: 
\"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.114761 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.114821 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99gq\" (UniqueName: \"kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.216111 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.216224 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.216278 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99gq\" (UniqueName: \"kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq\") pod 
\"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.216793 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.217043 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.239707 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99gq\" (UniqueName: \"kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq\") pod \"community-operators-rbs5g\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") " pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:54 crc kubenswrapper[5118]: I0223 08:23:54.330514 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:23:55 crc kubenswrapper[5118]: W0223 08:23:55.487395 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088ea52e_90bf_4d9c_94ab_245c932dd54c.slice/crio-74a77b6ecf20b62d23aae06f27ac6d23933c960e7db5b508dff04813011a9db6 WatchSource:0}: Error finding container 74a77b6ecf20b62d23aae06f27ac6d23933c960e7db5b508dff04813011a9db6: Status 404 returned error can't find the container with id 74a77b6ecf20b62d23aae06f27ac6d23933c960e7db5b508dff04813011a9db6 Feb 23 08:23:55 crc kubenswrapper[5118]: I0223 08:23:55.495328 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbs5g"] Feb 23 08:23:56 crc kubenswrapper[5118]: I0223 08:23:56.063874 5118 generic.go:334] "Generic (PLEG): container finished" podID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerID="aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552" exitCode=0 Feb 23 08:23:56 crc kubenswrapper[5118]: I0223 08:23:56.063945 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerDied","Data":"aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552"} Feb 23 08:23:56 crc kubenswrapper[5118]: I0223 08:23:56.063973 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerStarted","Data":"74a77b6ecf20b62d23aae06f27ac6d23933c960e7db5b508dff04813011a9db6"} Feb 23 08:23:56 crc kubenswrapper[5118]: I0223 08:23:56.066991 5118 generic.go:334] "Generic (PLEG): container finished" podID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerID="88b6861088adeca66293a8d288655e8a6512441fed9439a51538f01adb0dcac1" exitCode=0 Feb 23 08:23:56 crc kubenswrapper[5118]: I0223 
08:23:56.067024 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerDied","Data":"88b6861088adeca66293a8d288655e8a6512441fed9439a51538f01adb0dcac1"} Feb 23 08:23:57 crc kubenswrapper[5118]: I0223 08:23:57.076770 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerStarted","Data":"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"} Feb 23 08:23:57 crc kubenswrapper[5118]: I0223 08:23:57.081265 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerStarted","Data":"703b6facec2340d4625afcc233cffd4424ebad1e919532913e6f011a2e0d86d3"} Feb 23 08:23:57 crc kubenswrapper[5118]: I0223 08:23:57.133268 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8b4cq" podStartSLOduration=2.682837245 podStartE2EDuration="5.133242122s" podCreationTimestamp="2026-02-23 08:23:52 +0000 UTC" firstStartedPulling="2026-02-23 08:23:54.043488483 +0000 UTC m=+5897.047273056" lastFinishedPulling="2026-02-23 08:23:56.49389335 +0000 UTC m=+5899.497677933" observedRunningTime="2026-02-23 08:23:57.130364383 +0000 UTC m=+5900.134148986" watchObservedRunningTime="2026-02-23 08:23:57.133242122 +0000 UTC m=+5900.137026685" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.092938 5118 generic.go:334] "Generic (PLEG): container finished" podID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerID="035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b" exitCode=0 Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.093158 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" 
event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerDied","Data":"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"} Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.420928 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"] Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.424378 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.439399 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"] Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.513544 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24md4\" (UniqueName: \"kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.513604 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.513663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.614586 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.614685 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.614747 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24md4\" (UniqueName: \"kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.615468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.615768 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.638045 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24md4\" 
(UniqueName: \"kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4\") pod \"redhat-operators-6kg8w\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") " pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:58 crc kubenswrapper[5118]: I0223 08:23:58.744362 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kg8w" Feb 23 08:23:59 crc kubenswrapper[5118]: I0223 08:23:59.040988 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"] Feb 23 08:23:59 crc kubenswrapper[5118]: W0223 08:23:59.045875 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31082e2f_e315_4fc0_8611_51841727000f.slice/crio-6ec35e3ef952eb73d2d2c38633382c4162267108e920979aaf8f3491e967c402 WatchSource:0}: Error finding container 6ec35e3ef952eb73d2d2c38633382c4162267108e920979aaf8f3491e967c402: Status 404 returned error can't find the container with id 6ec35e3ef952eb73d2d2c38633382c4162267108e920979aaf8f3491e967c402 Feb 23 08:23:59 crc kubenswrapper[5118]: I0223 08:23:59.099358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerStarted","Data":"6ec35e3ef952eb73d2d2c38633382c4162267108e920979aaf8f3491e967c402"} Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.106738 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerStarted","Data":"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"} Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.108847 5118 generic.go:334] "Generic (PLEG): container finished" podID="31082e2f-e315-4fc0-8611-51841727000f" 
containerID="10dd066b092fae5e30205a9b40e7cf7e4e88b8da461fdbebc180d75b8d04bcc1" exitCode=0 Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.108880 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerDied","Data":"10dd066b092fae5e30205a9b40e7cf7e4e88b8da461fdbebc180d75b8d04bcc1"} Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.135381 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rbs5g" podStartSLOduration=3.665394482 podStartE2EDuration="7.135363233s" podCreationTimestamp="2026-02-23 08:23:53 +0000 UTC" firstStartedPulling="2026-02-23 08:23:56.065697494 +0000 UTC m=+5899.069482057" lastFinishedPulling="2026-02-23 08:23:59.535666235 +0000 UTC m=+5902.539450808" observedRunningTime="2026-02-23 08:24:00.135362993 +0000 UTC m=+5903.139147566" watchObservedRunningTime="2026-02-23 08:24:00.135363233 +0000 UTC m=+5903.139147796" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.216048 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.217187 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.222406 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.222495 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.222707 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wxzbt" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.222887 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.223850 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.229286 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.338800 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.339211 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.339304 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-th8tx\" (UniqueName: \"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.440471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.440552 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.440587 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th8tx\" (UniqueName: \"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.441472 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.441608 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config\") pod 
\"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.483427 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th8tx\" (UniqueName: \"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx\") pod \"dnsmasq-dns-787c4dc9-wz4tb\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.492239 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"] Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.493641 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.503821 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"] Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.541422 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.541476 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gjz\" (UniqueName: \"kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.541500 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.543328 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.643855 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.644257 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.644288 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gjz\" (UniqueName: \"kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.645773 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.646036 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.698070 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gjz\" (UniqueName: \"kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz\") pod \"dnsmasq-dns-bb88b7bf5-qrvkl\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:00 crc kubenswrapper[5118]: I0223 08:24:00.827167 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.200751 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"] Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.285530 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:01 crc kubenswrapper[5118]: W0223 08:24:01.288499 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod309efc96_31a9_4913_b6b2_97f7ac5b4d60.slice/crio-d75d9e0b2b517c71bf6d1dada37c2e3fb8e8413ebe078180bbcd4270faf5b9a0 WatchSource:0}: Error finding container d75d9e0b2b517c71bf6d1dada37c2e3fb8e8413ebe078180bbcd4270faf5b9a0: Status 404 returned error can't find the container with id d75d9e0b2b517c71bf6d1dada37c2e3fb8e8413ebe078180bbcd4270faf5b9a0 Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.348210 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.373135 5118 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.373491 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.377235 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.377468 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z7xd8" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.377910 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.377926 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.380537 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470020 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470108 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdx4s\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470136 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470179 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470199 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470364 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470424 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470605 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.470687 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.574866 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.574929 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.574971 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.574999 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdx4s\" (UniqueName: 
\"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575014 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575048 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575065 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575089 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575126 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575602 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.575914 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.576395 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.576508 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.581903 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.582653 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.582657 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.585985 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.586017 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f28412002351dd7faaa1cdc039facc478af48ee316228c5423d737536d43b857/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.605981 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdx4s\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.630945 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.717991 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.750615 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.755517 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.760160 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.760434 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.760574 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.760727 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.760914 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jtkmt" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.767851 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.882937 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.882990 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883071 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883140 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883196 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xmf27\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883225 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883241 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.883275 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.984821 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.984972 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.984998 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985022 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmf27\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985067 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985087 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985129 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985148 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.985733 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.986961 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.987582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.989945 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.989973 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fde6efe325c4fc32923ec989bc4f441a900a9715e86789b84f83171ec6fd762e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.991369 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.994947 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:01 crc kubenswrapper[5118]: I0223 08:24:01.998533 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.000140 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.004028 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmf27\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.072536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.092738 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.125805 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" event={"ID":"309efc96-31a9-4913-b6b2-97f7ac5b4d60","Type":"ContainerStarted","Data":"d75d9e0b2b517c71bf6d1dada37c2e3fb8e8413ebe078180bbcd4270faf5b9a0"} Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.128851 5118 generic.go:334] "Generic (PLEG): container finished" podID="31082e2f-e315-4fc0-8611-51841727000f" containerID="35fc7b9beae91e3537de9917332b4b24c1aec8dffde88b01c3a435d5b9cdcb8e" exitCode=0 Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.128927 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerDied","Data":"35fc7b9beae91e3537de9917332b4b24c1aec8dffde88b01c3a435d5b9cdcb8e"} Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.130033 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" event={"ID":"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23","Type":"ContainerStarted","Data":"dd097f24343fe7b5bb8494f92df1f817845bd8a35868e537fcccca5f9bf26722"} Feb 23 08:24:02 crc kubenswrapper[5118]: W0223 08:24:02.365720 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8dfb84_18ac_44e7_8774_d1e193bf72b3.slice/crio-475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3 WatchSource:0}: Error finding container 475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3: Status 404 returned error can't find the container with id 475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3 Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.366693 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:24:02 crc kubenswrapper[5118]: 
I0223 08:24:02.391888 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.884041 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.885561 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.888995 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.889639 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-sj4mj" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.890048 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.891368 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.902737 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 23 08:24:02 crc kubenswrapper[5118]: I0223 08:24:02.910334 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.013116 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmhb\" (UniqueName: \"kubernetes.io/projected/6148598d-7822-4b48-b805-f2544e9bc5ea-kube-api-access-lsmhb\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.015195 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.015298 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.016928 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.017038 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.017148 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.017259 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.017489 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.065250 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.066503 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.069440 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mvcpf" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.069718 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.076739 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118573 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmhb\" (UniqueName: \"kubernetes.io/projected/6148598d-7822-4b48-b805-f2544e9bc5ea-kube-api-access-lsmhb\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118622 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118639 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118727 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118767 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118798 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.118820 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.120914 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-operator-scripts\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.121497 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-kolla-config\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.122561 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-generated\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.124758 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.124784 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f91d3ffc81c76bcc10ae07075cc50dc758a48ee4d9c2eb7449da32b69326372/globalmount\"" pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.125258 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6148598d-7822-4b48-b805-f2544e9bc5ea-config-data-default\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.126940 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.136330 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6148598d-7822-4b48-b805-f2544e9bc5ea-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.145619 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmhb\" (UniqueName: \"kubernetes.io/projected/6148598d-7822-4b48-b805-f2544e9bc5ea-kube-api-access-lsmhb\") pod \"openstack-galera-0\" (UID: 
\"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.153800 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerStarted","Data":"ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8"} Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.167320 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerStarted","Data":"475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3"} Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.178361 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerStarted","Data":"5a456dbd3d4d2713b2ce1a91d47c84047b8a2e206a5e31f1c147ef637403b428"} Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.190516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1f66a6dd-9c9b-47c6-84fb-3fdf5b093afc\") pod \"openstack-galera-0\" (UID: \"6148598d-7822-4b48-b805-f2544e9bc5ea\") " pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.209763 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6kg8w" podStartSLOduration=2.703012942 podStartE2EDuration="5.209689771s" podCreationTimestamp="2026-02-23 08:23:58 +0000 UTC" firstStartedPulling="2026-02-23 08:24:00.110453514 +0000 UTC m=+5903.114238087" lastFinishedPulling="2026-02-23 08:24:02.617130333 +0000 UTC m=+5905.620914916" observedRunningTime="2026-02-23 08:24:03.182270171 +0000 UTC m=+5906.186054744" 
watchObservedRunningTime="2026-02-23 08:24:03.209689771 +0000 UTC m=+5906.213474444" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.210064 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.221130 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmtk\" (UniqueName: \"kubernetes.io/projected/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kube-api-access-jmmtk\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.221571 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-config-data\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.221616 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kolla-config\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.329774 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmtk\" (UniqueName: \"kubernetes.io/projected/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kube-api-access-jmmtk\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.330000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-config-data\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.330022 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kolla-config\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.331134 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kolla-config\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.331167 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-config-data\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.356474 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.356822 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.365866 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmtk\" (UniqueName: \"kubernetes.io/projected/c42fb7ad-c7fe-425a-b3b5-82cf6c03122b-kube-api-access-jmmtk\") pod \"memcached-0\" (UID: \"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b\") " pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 
08:24:03.389041 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.405577 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.668016 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 08:24:03 crc kubenswrapper[5118]: W0223 08:24:03.674461 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc42fb7ad_c7fe_425a_b3b5_82cf6c03122b.slice/crio-1cba8f2a81bf84cd8a819631f7bfca18bce773e775769f2e892fe80991546a3f WatchSource:0}: Error finding container 1cba8f2a81bf84cd8a819631f7bfca18bce773e775769f2e892fe80991546a3f: Status 404 returned error can't find the container with id 1cba8f2a81bf84cd8a819631f7bfca18bce773e775769f2e892fe80991546a3f Feb 23 08:24:03 crc kubenswrapper[5118]: W0223 08:24:03.745077 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6148598d_7822_4b48_b805_f2544e9bc5ea.slice/crio-c3842f9d53fa0251c716ea65dc70d896a30d7873259a86561a489c5a044c20e1 WatchSource:0}: Error finding container c3842f9d53fa0251c716ea65dc70d896a30d7873259a86561a489c5a044c20e1: Status 404 returned error can't find the container with id c3842f9d53fa0251c716ea65dc70d896a30d7873259a86561a489c5a044c20e1 Feb 23 08:24:03 crc kubenswrapper[5118]: I0223 08:24:03.746285 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.185853 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6148598d-7822-4b48-b805-f2544e9bc5ea","Type":"ContainerStarted","Data":"c3842f9d53fa0251c716ea65dc70d896a30d7873259a86561a489c5a044c20e1"} 
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.187249 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b","Type":"ContainerStarted","Data":"1cba8f2a81bf84cd8a819631f7bfca18bce773e775769f2e892fe80991546a3f"} Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.260320 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8b4cq" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.280410 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.281674 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.285682 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.285961 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.290713 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qsvhp" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.290889 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.312752 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.331511 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.331557 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349170 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349225 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349253 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmm7v\" (UniqueName: \"kubernetes.io/projected/3f104baf-13e1-44ac-bf73-ea400599dee0-kube-api-access-wmm7v\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349326 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349352 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349402 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349496 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.349530 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.428555 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rbs5g" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.451871 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " 
pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452002 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452046 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452064 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452083 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmm7v\" (UniqueName: \"kubernetes.io/projected/3f104baf-13e1-44ac-bf73-ea400599dee0-kube-api-access-wmm7v\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452190 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " 
pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452209 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452309 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.452995 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.453342 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.454174 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 
08:24:04.455452 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f104baf-13e1-44ac-bf73-ea400599dee0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.458608 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.458911 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f104baf-13e1-44ac-bf73-ea400599dee0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.464883 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.464920 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f250c10f75e4f876e7c0c19af6de44d1b13c0c9698758e8479a1e265c5435cd/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.469340 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmm7v\" (UniqueName: \"kubernetes.io/projected/3f104baf-13e1-44ac-bf73-ea400599dee0-kube-api-access-wmm7v\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.502527 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-97f2f5bb-ae4d-45af-b284-1402cf82840e\") pod \"openstack-cell1-galera-0\" (UID: \"3f104baf-13e1-44ac-bf73-ea400599dee0\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:04 crc kubenswrapper[5118]: I0223 08:24:04.607332 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 23 08:24:05 crc kubenswrapper[5118]: I0223 08:24:05.102992 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 23 08:24:05 crc kubenswrapper[5118]: I0223 08:24:05.442225 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rbs5g"
Feb 23 08:24:06 crc kubenswrapper[5118]: I0223 08:24:06.385598 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3f104baf-13e1-44ac-bf73-ea400599dee0","Type":"ContainerStarted","Data":"52e453a49a6ab739f5606fbc4ffa365d72cb1c0655d5a7626826927fea0519e0"}
Feb 23 08:24:07 crc kubenswrapper[5118]: I0223 08:24:07.205844 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"]
Feb 23 08:24:07 crc kubenswrapper[5118]: I0223 08:24:07.394454 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8b4cq" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="registry-server" containerID="cri-o://703b6facec2340d4625afcc233cffd4424ebad1e919532913e6f011a2e0d86d3" gracePeriod=2
Feb 23 08:24:08 crc kubenswrapper[5118]: I0223 08:24:08.746145 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:08 crc kubenswrapper[5118]: I0223 08:24:08.746711 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:08 crc kubenswrapper[5118]: I0223 08:24:08.811979 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbs5g"]
Feb 23 08:24:08 crc kubenswrapper[5118]: I0223 08:24:08.812533 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rbs5g" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="registry-server" containerID="cri-o://494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441" gracePeriod=2
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.291176 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbs5g"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.374168 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities\") pod \"088ea52e-90bf-4d9c-94ab-245c932dd54c\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") "
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.374371 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w99gq\" (UniqueName: \"kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq\") pod \"088ea52e-90bf-4d9c-94ab-245c932dd54c\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") "
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.374413 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content\") pod \"088ea52e-90bf-4d9c-94ab-245c932dd54c\" (UID: \"088ea52e-90bf-4d9c-94ab-245c932dd54c\") "
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.375126 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities" (OuterVolumeSpecName: "utilities") pod "088ea52e-90bf-4d9c-94ab-245c932dd54c" (UID: "088ea52e-90bf-4d9c-94ab-245c932dd54c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.382808 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq" (OuterVolumeSpecName: "kube-api-access-w99gq") pod "088ea52e-90bf-4d9c-94ab-245c932dd54c" (UID: "088ea52e-90bf-4d9c-94ab-245c932dd54c"). InnerVolumeSpecName "kube-api-access-w99gq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.417366 5118 generic.go:334] "Generic (PLEG): container finished" podID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerID="494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441" exitCode=0
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.417436 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerDied","Data":"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"}
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.417466 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbs5g" event={"ID":"088ea52e-90bf-4d9c-94ab-245c932dd54c","Type":"ContainerDied","Data":"74a77b6ecf20b62d23aae06f27ac6d23933c960e7db5b508dff04813011a9db6"}
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.417492 5118 scope.go:117] "RemoveContainer" containerID="494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.417612 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbs5g"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.422582 5118 generic.go:334] "Generic (PLEG): container finished" podID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerID="703b6facec2340d4625afcc233cffd4424ebad1e919532913e6f011a2e0d86d3" exitCode=0
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.422620 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerDied","Data":"703b6facec2340d4625afcc233cffd4424ebad1e919532913e6f011a2e0d86d3"}
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.438938 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "088ea52e-90bf-4d9c-94ab-245c932dd54c" (UID: "088ea52e-90bf-4d9c-94ab-245c932dd54c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.460613 5118 scope.go:117] "RemoveContainer" containerID="035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.485125 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.485163 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/088ea52e-90bf-4d9c-94ab-245c932dd54c-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.485174 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w99gq\" (UniqueName: \"kubernetes.io/projected/088ea52e-90bf-4d9c-94ab-245c932dd54c-kube-api-access-w99gq\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.499728 5118 scope.go:117] "RemoveContainer" containerID="aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.540150 5118 scope.go:117] "RemoveContainer" containerID="494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"
Feb 23 08:24:09 crc kubenswrapper[5118]: E0223 08:24:09.544881 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441\": container with ID starting with 494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441 not found: ID does not exist" containerID="494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.544931 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441"} err="failed to get container status \"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441\": rpc error: code = NotFound desc = could not find container \"494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441\": container with ID starting with 494cd8327e2cdd4f610b1e141d64a015b51d5f582c0f8ac47a575b12a2832441 not found: ID does not exist"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.544958 5118 scope.go:117] "RemoveContainer" containerID="035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"
Feb 23 08:24:09 crc kubenswrapper[5118]: E0223 08:24:09.548631 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b\": container with ID starting with 035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b not found: ID does not exist" containerID="035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.548669 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b"} err="failed to get container status \"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b\": rpc error: code = NotFound desc = could not find container \"035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b\": container with ID starting with 035fd31fb63b49f189b9471fc034a3fd107c64c2d1e174206f9ddb988d5d2b5b not found: ID does not exist"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.548686 5118 scope.go:117] "RemoveContainer" containerID="aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552"
Feb 23 08:24:09 crc kubenswrapper[5118]: E0223 08:24:09.548933 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552\": container with ID starting with aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552 not found: ID does not exist" containerID="aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.548949 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552"} err="failed to get container status \"aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552\": rpc error: code = NotFound desc = could not find container \"aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552\": container with ID starting with aa168c36154bcaf42b5f3ac6a6b8797827757f7312457ae3a9d82828bcf4e552 not found: ID does not exist"
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.776969 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbs5g"]
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.811977 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rbs5g"]
Feb 23 08:24:09 crc kubenswrapper[5118]: I0223 08:24:09.827535 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6kg8w" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:24:09 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:24:09 crc kubenswrapper[5118]: >
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.571567 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8b4cq"
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.605066 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content\") pod \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") "
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.605432 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities\") pod \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") "
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.605537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qlz9\" (UniqueName: \"kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9\") pod \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\" (UID: \"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db\") "
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.607087 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities" (OuterVolumeSpecName: "utilities") pod "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" (UID: "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.615070 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9" (OuterVolumeSpecName: "kube-api-access-4qlz9") pod "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" (UID: "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db"). InnerVolumeSpecName "kube-api-access-4qlz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.680421 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" (UID: "eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.708015 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.708055 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:10 crc kubenswrapper[5118]: I0223 08:24:10.708069 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qlz9\" (UniqueName: \"kubernetes.io/projected/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db-kube-api-access-4qlz9\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.445084 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8b4cq" event={"ID":"eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db","Type":"ContainerDied","Data":"d2094a0a8018297893c958574b5228b0a751e4c184a403ff9f4816fbc1251ebe"}
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.446003 5118 scope.go:117] "RemoveContainer" containerID="703b6facec2340d4625afcc233cffd4424ebad1e919532913e6f011a2e0d86d3"
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.445272 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8b4cq"
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.499678 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"]
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.503462 5118 scope.go:117] "RemoveContainer" containerID="88b6861088adeca66293a8d288655e8a6512441fed9439a51538f01adb0dcac1"
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.508050 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8b4cq"]
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.538316 5118 scope.go:117] "RemoveContainer" containerID="a0f6ac02b84be0e4b80b0d8c0932131a45e03a0838e293a4b0b876c1fe25deb0"
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.708016 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" path="/var/lib/kubelet/pods/088ea52e-90bf-4d9c-94ab-245c932dd54c/volumes"
Feb 23 08:24:11 crc kubenswrapper[5118]: I0223 08:24:11.708854 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" path="/var/lib/kubelet/pods/eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db/volumes"
Feb 23 08:24:18 crc kubenswrapper[5118]: I0223 08:24:18.813925 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:18 crc kubenswrapper[5118]: I0223 08:24:18.924929 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:19 crc kubenswrapper[5118]: I0223 08:24:19.053995 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"]
Feb 23 08:24:20 crc kubenswrapper[5118]: I0223 08:24:20.564348 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6kg8w" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server" containerID="cri-o://ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8" gracePeriod=2
Feb 23 08:24:21 crc kubenswrapper[5118]: I0223 08:24:21.578179 5118 generic.go:334] "Generic (PLEG): container finished" podID="31082e2f-e315-4fc0-8611-51841727000f" containerID="ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8" exitCode=0
Feb 23 08:24:21 crc kubenswrapper[5118]: I0223 08:24:21.578268 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerDied","Data":"ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8"}
Feb 23 08:24:28 crc kubenswrapper[5118]: E0223 08:24:28.746222 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8 is running failed: container process not found" containerID="ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8" cmd=["grpc_health_probe","-addr=:50051"]
Feb 23 08:24:28 crc kubenswrapper[5118]: E0223 08:24:28.747874 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8 is running failed: container process not found" containerID="ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8" cmd=["grpc_health_probe","-addr=:50051"]
Feb 23 08:24:28 crc kubenswrapper[5118]: E0223 08:24:28.748675 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8 is running failed: container process not found" containerID="ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8" cmd=["grpc_health_probe","-addr=:50051"]
Feb 23 08:24:28 crc kubenswrapper[5118]: E0223 08:24:28.748777 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-6kg8w" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server"
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.602036 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.682344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6kg8w" event={"ID":"31082e2f-e315-4fc0-8611-51841727000f","Type":"ContainerDied","Data":"6ec35e3ef952eb73d2d2c38633382c4162267108e920979aaf8f3491e967c402"}
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.682437 5118 scope.go:117] "RemoveContainer" containerID="ba88c216fd7d6343a9c62f42718105400690a6bdfa9bd84c647c6eb0bbce0dd8"
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.683033 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6kg8w"
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.750514 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24md4\" (UniqueName: \"kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4\") pod \"31082e2f-e315-4fc0-8611-51841727000f\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") "
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.750609 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities\") pod \"31082e2f-e315-4fc0-8611-51841727000f\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") "
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.750736 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content\") pod \"31082e2f-e315-4fc0-8611-51841727000f\" (UID: \"31082e2f-e315-4fc0-8611-51841727000f\") "
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.752500 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities" (OuterVolumeSpecName: "utilities") pod "31082e2f-e315-4fc0-8611-51841727000f" (UID: "31082e2f-e315-4fc0-8611-51841727000f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.758829 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4" (OuterVolumeSpecName: "kube-api-access-24md4") pod "31082e2f-e315-4fc0-8611-51841727000f" (UID: "31082e2f-e315-4fc0-8611-51841727000f"). InnerVolumeSpecName "kube-api-access-24md4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.853878 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.853918 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24md4\" (UniqueName: \"kubernetes.io/projected/31082e2f-e315-4fc0-8611-51841727000f-kube-api-access-24md4\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:30 crc kubenswrapper[5118]: I0223 08:24:30.990025 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31082e2f-e315-4fc0-8611-51841727000f" (UID: "31082e2f-e315-4fc0-8611-51841727000f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.082775 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31082e2f-e315-4fc0-8611-51841727000f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.322241 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"]
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.327766 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6kg8w"]
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.434885 5118 scope.go:117] "RemoveContainer" containerID="35fc7b9beae91e3537de9917332b4b24c1aec8dffde88b01c3a435d5b9cdcb8e"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.486462 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.486529 5118 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.486704 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-th8tx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-787c4dc9-wz4tb_openstack(309efc96-31a9-4913-b6b2-97f7ac5b4d60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.488818 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.504204 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.504257 5118 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.504389 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h564h676h699hcdh67bh66hfdh569h545h648h94h546h696h668h89h96h667h575h595h5d9h584h8dhbdh697h54bhb7h58fh5c9hd8h5cdh5c7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27gjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bb88b7bf5-qrvkl_openstack(0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.505715 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23"
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.595341 5118 scope.go:117] "RemoveContainer" containerID="10dd066b092fae5e30205a9b40e7cf7e4e88b8da461fdbebc180d75b8d04bcc1"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.692819 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60"
Feb 23 08:24:31 crc kubenswrapper[5118]: E0223 08:24:31.692851 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23"
Feb 23 08:24:31 crc kubenswrapper[5118]: I0223 08:24:31.716124 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31082e2f-e315-4fc0-8611-51841727000f" path="/var/lib/kubelet/pods/31082e2f-e315-4fc0-8611-51841727000f/volumes"
Feb 23 08:24:32 crc kubenswrapper[5118]: I0223 08:24:32.700118 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c42fb7ad-c7fe-425a-b3b5-82cf6c03122b","Type":"ContainerStarted","Data":"070ad0bcc4fcbf43462c71e5710c43eab671d4a905b2e7b7ba91b7cfc3d5fbfd"}
Feb 23 08:24:32 crc kubenswrapper[5118]: I0223 08:24:32.700572 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 23 08:24:32 crc kubenswrapper[5118]: I0223 08:24:32.702348 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3f104baf-13e1-44ac-bf73-ea400599dee0","Type":"ContainerStarted","Data":"992c0b38f91cb98c4f94888f70436fd95ac08db7df8f97e62a8fe15b96e8dfce"}
Feb 23 08:24:32 crc kubenswrapper[5118]: I0223 08:24:32.704628 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6148598d-7822-4b48-b805-f2544e9bc5ea","Type":"ContainerStarted","Data":"8d7830134aa480d6e4d21f908f35c0205c65f9e145593b9de60739992f67c69a"}
Feb 23 08:24:32 crc kubenswrapper[5118]: I0223 08:24:32.727088 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.9689267099999999 podStartE2EDuration="29.727059782s" podCreationTimestamp="2026-02-23 08:24:03 +0000 UTC" firstStartedPulling="2026-02-23 08:24:03.676767611 +0000 UTC m=+5906.680552184" lastFinishedPulling="2026-02-23 08:24:31.434900683 +0000 UTC m=+5934.438685256" observedRunningTime="2026-02-23 08:24:32.720760101 +0000 UTC m=+5935.724544724" watchObservedRunningTime="2026-02-23 08:24:32.727059782 +0000 UTC m=+5935.730844365"
Feb 23 08:24:33 crc kubenswrapper[5118]: I0223 08:24:33.726713 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerStarted","Data":"0d2ddb0bc434e2bbf38831b1667e0dba12b12c6c007226642771685944a8d637"}
Feb 23 08:24:33 crc kubenswrapper[5118]: I0223 08:24:33.729166 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerStarted","Data":"e615da9c9bdf402865129f0b334b1b9da733a273a2c5c8c73dacb4933d99735b"}
Feb 23 08:24:35 crc kubenswrapper[5118]: I0223 08:24:35.756937 5118 generic.go:334] "Generic (PLEG): container finished" podID="6148598d-7822-4b48-b805-f2544e9bc5ea" containerID="8d7830134aa480d6e4d21f908f35c0205c65f9e145593b9de60739992f67c69a" exitCode=0
Feb 23 08:24:35 crc kubenswrapper[5118]: I0223 08:24:35.757025 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6148598d-7822-4b48-b805-f2544e9bc5ea","Type":"ContainerDied","Data":"8d7830134aa480d6e4d21f908f35c0205c65f9e145593b9de60739992f67c69a"}
Feb 23 08:24:35 crc kubenswrapper[5118]: I0223 08:24:35.764874 5118 generic.go:334] "Generic (PLEG): container finished" podID="3f104baf-13e1-44ac-bf73-ea400599dee0" containerID="992c0b38f91cb98c4f94888f70436fd95ac08db7df8f97e62a8fe15b96e8dfce" exitCode=0
Feb 23 08:24:35 crc kubenswrapper[5118]: I0223 08:24:35.764988 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3f104baf-13e1-44ac-bf73-ea400599dee0","Type":"ContainerDied","Data":"992c0b38f91cb98c4f94888f70436fd95ac08db7df8f97e62a8fe15b96e8dfce"}
Feb 23 08:24:36 crc kubenswrapper[5118]: I0223 08:24:36.778668 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"3f104baf-13e1-44ac-bf73-ea400599dee0","Type":"ContainerStarted","Data":"89e88a79204891f81a85678fc29202ad62ab48f18fc44e044cf55312a0dcc294"}
Feb 23 08:24:36 crc kubenswrapper[5118]: I0223 08:24:36.782984 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"6148598d-7822-4b48-b805-f2544e9bc5ea","Type":"ContainerStarted","Data":"45719e0233c4db926630f30fb51ddd2285a4c5a054d0547743615567a95980d6"}
Feb 23 08:24:36 crc
kubenswrapper[5118]: I0223 08:24:36.858061 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.170438788 podStartE2EDuration="35.858025394s" podCreationTimestamp="2026-02-23 08:24:01 +0000 UTC" firstStartedPulling="2026-02-23 08:24:03.74742693 +0000 UTC m=+5906.751211503" lastFinishedPulling="2026-02-23 08:24:31.435013506 +0000 UTC m=+5934.438798109" observedRunningTime="2026-02-23 08:24:36.853749441 +0000 UTC m=+5939.857534044" watchObservedRunningTime="2026-02-23 08:24:36.858025394 +0000 UTC m=+5939.861810007" Feb 23 08:24:36 crc kubenswrapper[5118]: I0223 08:24:36.867841 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.641291002 podStartE2EDuration="33.86781514s" podCreationTimestamp="2026-02-23 08:24:03 +0000 UTC" firstStartedPulling="2026-02-23 08:24:05.270424708 +0000 UTC m=+5908.274209281" lastFinishedPulling="2026-02-23 08:24:31.496948836 +0000 UTC m=+5934.500733419" observedRunningTime="2026-02-23 08:24:36.819579029 +0000 UTC m=+5939.823363632" watchObservedRunningTime="2026-02-23 08:24:36.86781514 +0000 UTC m=+5939.871599753" Feb 23 08:24:38 crc kubenswrapper[5118]: I0223 08:24:38.392621 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 08:24:43 crc kubenswrapper[5118]: I0223 08:24:43.211906 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 23 08:24:43 crc kubenswrapper[5118]: I0223 08:24:43.212817 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 23 08:24:43 crc kubenswrapper[5118]: I0223 08:24:43.315278 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 23 08:24:43 crc kubenswrapper[5118]: I0223 08:24:43.969545 5118 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 23 08:24:44 crc kubenswrapper[5118]: I0223 08:24:44.607899 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:44 crc kubenswrapper[5118]: I0223 08:24:44.609749 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:44 crc kubenswrapper[5118]: I0223 08:24:44.713768 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:44 crc kubenswrapper[5118]: I0223 08:24:44.885426 5118 generic.go:334] "Generic (PLEG): container finished" podID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerID="91879ebcb9221ab0786b206ee839de272fed3bd59c9ba0f98b739357ebd60240" exitCode=0 Feb 23 08:24:44 crc kubenswrapper[5118]: I0223 08:24:44.885548 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" event={"ID":"309efc96-31a9-4913-b6b2-97f7ac5b4d60","Type":"ContainerDied","Data":"91879ebcb9221ab0786b206ee839de272fed3bd59c9ba0f98b739357ebd60240"} Feb 23 08:24:45 crc kubenswrapper[5118]: I0223 08:24:45.023889 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 23 08:24:45 crc kubenswrapper[5118]: I0223 08:24:45.900327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" event={"ID":"309efc96-31a9-4913-b6b2-97f7ac5b4d60","Type":"ContainerStarted","Data":"1640de891b2af7af38898fcfacf50cc23985e179d29934d58493ca7a43fc4ae5"} Feb 23 08:24:45 crc kubenswrapper[5118]: I0223 08:24:45.935372 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" podStartSLOduration=2.856641062 podStartE2EDuration="45.935349963s" podCreationTimestamp="2026-02-23 08:24:00 +0000 UTC" firstStartedPulling="2026-02-23 
08:24:01.290675851 +0000 UTC m=+5904.294460434" lastFinishedPulling="2026-02-23 08:24:44.369384752 +0000 UTC m=+5947.373169335" observedRunningTime="2026-02-23 08:24:45.924805119 +0000 UTC m=+5948.928589702" watchObservedRunningTime="2026-02-23 08:24:45.935349963 +0000 UTC m=+5948.939134546" Feb 23 08:24:46 crc kubenswrapper[5118]: I0223 08:24:46.914350 5118 generic.go:334] "Generic (PLEG): container finished" podID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerID="82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938" exitCode=0 Feb 23 08:24:46 crc kubenswrapper[5118]: I0223 08:24:46.914490 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" event={"ID":"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23","Type":"ContainerDied","Data":"82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938"} Feb 23 08:24:47 crc kubenswrapper[5118]: I0223 08:24:47.928460 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" event={"ID":"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23","Type":"ContainerStarted","Data":"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"} Feb 23 08:24:47 crc kubenswrapper[5118]: I0223 08:24:47.929303 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:47 crc kubenswrapper[5118]: I0223 08:24:47.968896 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" podStartSLOduration=-9223371988.885948 podStartE2EDuration="47.968827724s" podCreationTimestamp="2026-02-23 08:24:00 +0000 UTC" firstStartedPulling="2026-02-23 08:24:01.214342885 +0000 UTC m=+5904.218127448" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:24:47.957911282 +0000 UTC m=+5950.961695905" watchObservedRunningTime="2026-02-23 08:24:47.968827724 +0000 UTC m=+5950.972612327" Feb 23 08:24:50 crc kubenswrapper[5118]: 
I0223 08:24:50.544297 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:50 crc kubenswrapper[5118]: I0223 08:24:50.546364 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881021 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gg7rc"] Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881708 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881741 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881812 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881831 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881873 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881892 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881920 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881936 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881958 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.881973 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.881997 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882014 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="extract-content" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.882062 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882077 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.882159 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882176 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: E0223 08:24:51.882211 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882228 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="extract-utilities" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882599 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="31082e2f-e315-4fc0-8611-51841727000f" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882667 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="088ea52e-90bf-4d9c-94ab-245c932dd54c" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.882698 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6ea9ff-5ee0-49f2-ac7f-75e1ac8369db" containerName="registry-server" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.883659 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.886731 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 08:24:51 crc kubenswrapper[5118]: I0223 08:24:51.906516 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gg7rc"] Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.009636 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts\") pod \"root-account-create-update-gg7rc\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.009737 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8tq\" (UniqueName: \"kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq\") pod \"root-account-create-update-gg7rc\" (UID: 
\"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.113458 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts\") pod \"root-account-create-update-gg7rc\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.113606 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8tq\" (UniqueName: \"kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq\") pod \"root-account-create-update-gg7rc\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.116686 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts\") pod \"root-account-create-update-gg7rc\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.153125 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g8tq\" (UniqueName: \"kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq\") pod \"root-account-create-update-gg7rc\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.215366 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.538814 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gg7rc"] Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.984734 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gg7rc" event={"ID":"fd8e2260-1604-42a1-b756-c5916007b894","Type":"ContainerStarted","Data":"6a892b34811f623216b756a67d0349d4413e5929bacc85ee145a7e573403d069"} Feb 23 08:24:52 crc kubenswrapper[5118]: I0223 08:24:52.984787 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gg7rc" event={"ID":"fd8e2260-1604-42a1-b756-c5916007b894","Type":"ContainerStarted","Data":"a8403114feacfb59ae3e6404d6e05b1f0f82b962784529bd9664bdd6fd3d063f"} Feb 23 08:24:53 crc kubenswrapper[5118]: I0223 08:24:53.024583 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gg7rc" podStartSLOduration=2.024550451 podStartE2EDuration="2.024550451s" podCreationTimestamp="2026-02-23 08:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:24:53.010248928 +0000 UTC m=+5956.014033541" watchObservedRunningTime="2026-02-23 08:24:53.024550451 +0000 UTC m=+5956.028335065" Feb 23 08:24:54 crc kubenswrapper[5118]: I0223 08:24:54.016338 5118 generic.go:334] "Generic (PLEG): container finished" podID="fd8e2260-1604-42a1-b756-c5916007b894" containerID="6a892b34811f623216b756a67d0349d4413e5929bacc85ee145a7e573403d069" exitCode=0 Feb 23 08:24:54 crc kubenswrapper[5118]: I0223 08:24:54.016424 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gg7rc" 
event={"ID":"fd8e2260-1604-42a1-b756-c5916007b894","Type":"ContainerDied","Data":"6a892b34811f623216b756a67d0349d4413e5929bacc85ee145a7e573403d069"} Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.446891 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.634551 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8tq\" (UniqueName: \"kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq\") pod \"fd8e2260-1604-42a1-b756-c5916007b894\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.634831 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts\") pod \"fd8e2260-1604-42a1-b756-c5916007b894\" (UID: \"fd8e2260-1604-42a1-b756-c5916007b894\") " Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.635876 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd8e2260-1604-42a1-b756-c5916007b894" (UID: "fd8e2260-1604-42a1-b756-c5916007b894"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.643582 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq" (OuterVolumeSpecName: "kube-api-access-4g8tq") pod "fd8e2260-1604-42a1-b756-c5916007b894" (UID: "fd8e2260-1604-42a1-b756-c5916007b894"). InnerVolumeSpecName "kube-api-access-4g8tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.737304 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g8tq\" (UniqueName: \"kubernetes.io/projected/fd8e2260-1604-42a1-b756-c5916007b894-kube-api-access-4g8tq\") on node \"crc\" DevicePath \"\"" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.737355 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd8e2260-1604-42a1-b756-c5916007b894-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.831057 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.898854 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:55 crc kubenswrapper[5118]: I0223 08:24:55.899368 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="dnsmasq-dns" containerID="cri-o://1640de891b2af7af38898fcfacf50cc23985e179d29934d58493ca7a43fc4ae5" gracePeriod=10 Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.046030 5118 generic.go:334] "Generic (PLEG): container finished" podID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerID="1640de891b2af7af38898fcfacf50cc23985e179d29934d58493ca7a43fc4ae5" exitCode=0 Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.046087 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" event={"ID":"309efc96-31a9-4913-b6b2-97f7ac5b4d60","Type":"ContainerDied","Data":"1640de891b2af7af38898fcfacf50cc23985e179d29934d58493ca7a43fc4ae5"} Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.078607 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-gg7rc" event={"ID":"fd8e2260-1604-42a1-b756-c5916007b894","Type":"ContainerDied","Data":"a8403114feacfb59ae3e6404d6e05b1f0f82b962784529bd9664bdd6fd3d063f"} Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.078652 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8403114feacfb59ae3e6404d6e05b1f0f82b962784529bd9664bdd6fd3d063f" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.078708 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gg7rc" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.340136 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.454013 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config\") pod \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.454090 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc\") pod \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.454171 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th8tx\" (UniqueName: \"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx\") pod \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\" (UID: \"309efc96-31a9-4913-b6b2-97f7ac5b4d60\") " Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.459931 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx" (OuterVolumeSpecName: "kube-api-access-th8tx") pod "309efc96-31a9-4913-b6b2-97f7ac5b4d60" (UID: "309efc96-31a9-4913-b6b2-97f7ac5b4d60"). InnerVolumeSpecName "kube-api-access-th8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.493090 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "309efc96-31a9-4913-b6b2-97f7ac5b4d60" (UID: "309efc96-31a9-4913-b6b2-97f7ac5b4d60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.496371 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config" (OuterVolumeSpecName: "config") pod "309efc96-31a9-4913-b6b2-97f7ac5b4d60" (UID: "309efc96-31a9-4913-b6b2-97f7ac5b4d60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.556976 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.557018 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/309efc96-31a9-4913-b6b2-97f7ac5b4d60-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:24:56 crc kubenswrapper[5118]: I0223 08:24:56.557032 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th8tx\" (UniqueName: \"kubernetes.io/projected/309efc96-31a9-4913-b6b2-97f7ac5b4d60-kube-api-access-th8tx\") on node \"crc\" DevicePath \"\"" Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.088518 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" event={"ID":"309efc96-31a9-4913-b6b2-97f7ac5b4d60","Type":"ContainerDied","Data":"d75d9e0b2b517c71bf6d1dada37c2e3fb8e8413ebe078180bbcd4270faf5b9a0"} Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.088669 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-787c4dc9-wz4tb" Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.088678 5118 scope.go:117] "RemoveContainer" containerID="1640de891b2af7af38898fcfacf50cc23985e179d29934d58493ca7a43fc4ae5" Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.120901 5118 scope.go:117] "RemoveContainer" containerID="91879ebcb9221ab0786b206ee839de272fed3bd59c9ba0f98b739357ebd60240" Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.137015 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.147965 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-787c4dc9-wz4tb"] Feb 23 08:24:57 crc kubenswrapper[5118]: I0223 08:24:57.716298 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" path="/var/lib/kubelet/pods/309efc96-31a9-4913-b6b2-97f7ac5b4d60/volumes" Feb 23 08:24:58 crc kubenswrapper[5118]: I0223 08:24:58.257396 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gg7rc"] Feb 23 08:24:58 crc kubenswrapper[5118]: I0223 08:24:58.271503 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gg7rc"] Feb 23 08:24:59 crc kubenswrapper[5118]: I0223 08:24:59.717127 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd8e2260-1604-42a1-b756-c5916007b894" path="/var/lib/kubelet/pods/fd8e2260-1604-42a1-b756-c5916007b894/volumes" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.256575 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9vfl4"] Feb 23 08:25:03 crc kubenswrapper[5118]: E0223 08:25:03.260945 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="dnsmasq-dns" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 
08:25:03.260990 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="dnsmasq-dns" Feb 23 08:25:03 crc kubenswrapper[5118]: E0223 08:25:03.261060 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="init" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.261078 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="init" Feb 23 08:25:03 crc kubenswrapper[5118]: E0223 08:25:03.261168 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8e2260-1604-42a1-b756-c5916007b894" containerName="mariadb-account-create-update" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.261190 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8e2260-1604-42a1-b756-c5916007b894" containerName="mariadb-account-create-update" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.261581 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="309efc96-31a9-4913-b6b2-97f7ac5b4d60" containerName="dnsmasq-dns" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.261643 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8e2260-1604-42a1-b756-c5916007b894" containerName="mariadb-account-create-update" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.262922 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.266306 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.275698 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9vfl4"] Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.397118 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w42n\" (UniqueName: \"kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n\") pod \"root-account-create-update-9vfl4\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.397237 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts\") pod \"root-account-create-update-9vfl4\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.498823 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w42n\" (UniqueName: \"kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n\") pod \"root-account-create-update-9vfl4\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.498955 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts\") pod \"root-account-create-update-9vfl4\" (UID: 
\"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.500308 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts\") pod \"root-account-create-update-9vfl4\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.535093 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w42n\" (UniqueName: \"kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n\") pod \"root-account-create-update-9vfl4\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.609550 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:03 crc kubenswrapper[5118]: I0223 08:25:03.929419 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9vfl4"] Feb 23 08:25:04 crc kubenswrapper[5118]: I0223 08:25:04.162742 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9vfl4" event={"ID":"90069789-63ca-43a0-9aea-a49dc5f2233f","Type":"ContainerStarted","Data":"e3278b9dc1a7662d6fb2aeed4b7f12bfa7e0cabeb6e13e6905f874640e2e9f4c"} Feb 23 08:25:04 crc kubenswrapper[5118]: I0223 08:25:04.163355 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9vfl4" event={"ID":"90069789-63ca-43a0-9aea-a49dc5f2233f","Type":"ContainerStarted","Data":"f6190f2a2a1188d0e180644aa48ff0cb182c1c442e1f0d8a79c374f717186f92"} Feb 23 08:25:04 crc kubenswrapper[5118]: I0223 08:25:04.185495 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9vfl4" podStartSLOduration=1.185460538 podStartE2EDuration="1.185460538s" podCreationTimestamp="2026-02-23 08:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:25:04.179684039 +0000 UTC m=+5967.183468612" watchObservedRunningTime="2026-02-23 08:25:04.185460538 +0000 UTC m=+5967.189245141" Feb 23 08:25:05 crc kubenswrapper[5118]: I0223 08:25:05.175090 5118 generic.go:334] "Generic (PLEG): container finished" podID="90069789-63ca-43a0-9aea-a49dc5f2233f" containerID="e3278b9dc1a7662d6fb2aeed4b7f12bfa7e0cabeb6e13e6905f874640e2e9f4c" exitCode=0 Feb 23 08:25:05 crc kubenswrapper[5118]: I0223 08:25:05.175195 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9vfl4" 
event={"ID":"90069789-63ca-43a0-9aea-a49dc5f2233f","Type":"ContainerDied","Data":"e3278b9dc1a7662d6fb2aeed4b7f12bfa7e0cabeb6e13e6905f874640e2e9f4c"} Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.608144 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.755014 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts\") pod \"90069789-63ca-43a0-9aea-a49dc5f2233f\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.755246 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w42n\" (UniqueName: \"kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n\") pod \"90069789-63ca-43a0-9aea-a49dc5f2233f\" (UID: \"90069789-63ca-43a0-9aea-a49dc5f2233f\") " Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.756012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90069789-63ca-43a0-9aea-a49dc5f2233f" (UID: "90069789-63ca-43a0-9aea-a49dc5f2233f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.756236 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90069789-63ca-43a0-9aea-a49dc5f2233f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.772133 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n" (OuterVolumeSpecName: "kube-api-access-2w42n") pod "90069789-63ca-43a0-9aea-a49dc5f2233f" (UID: "90069789-63ca-43a0-9aea-a49dc5f2233f"). InnerVolumeSpecName "kube-api-access-2w42n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:06 crc kubenswrapper[5118]: I0223 08:25:06.857915 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w42n\" (UniqueName: \"kubernetes.io/projected/90069789-63ca-43a0-9aea-a49dc5f2233f-kube-api-access-2w42n\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.193069 5118 generic.go:334] "Generic (PLEG): container finished" podID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerID="e615da9c9bdf402865129f0b334b1b9da733a273a2c5c8c73dacb4933d99735b" exitCode=0 Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.193198 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerDied","Data":"e615da9c9bdf402865129f0b334b1b9da733a273a2c5c8c73dacb4933d99735b"} Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.196462 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerDied","Data":"0d2ddb0bc434e2bbf38831b1667e0dba12b12c6c007226642771685944a8d637"} Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.197256 
5118 generic.go:334] "Generic (PLEG): container finished" podID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerID="0d2ddb0bc434e2bbf38831b1667e0dba12b12c6c007226642771685944a8d637" exitCode=0 Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.200554 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9vfl4" event={"ID":"90069789-63ca-43a0-9aea-a49dc5f2233f","Type":"ContainerDied","Data":"f6190f2a2a1188d0e180644aa48ff0cb182c1c442e1f0d8a79c374f717186f92"} Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.200582 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6190f2a2a1188d0e180644aa48ff0cb182c1c442e1f0d8a79c374f717186f92" Feb 23 08:25:07 crc kubenswrapper[5118]: I0223 08:25:07.200629 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9vfl4" Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.211051 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerStarted","Data":"4366b47e0c1490943c7e9055ae4b446af1780b60bd13644b14fa90c66ad14795"} Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.211750 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.213687 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerStarted","Data":"7828d7983304b13e19e9a0fc884e4d76e6e3526ff33d0440568f9a4a6b308446"} Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.213884 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.259368 5118 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.194484317 podStartE2EDuration="1m8.259343659s" podCreationTimestamp="2026-02-23 08:24:00 +0000 UTC" firstStartedPulling="2026-02-23 08:24:02.370647677 +0000 UTC m=+5905.374432250" lastFinishedPulling="2026-02-23 08:24:31.435507019 +0000 UTC m=+5934.439291592" observedRunningTime="2026-02-23 08:25:08.251840329 +0000 UTC m=+5971.255624902" watchObservedRunningTime="2026-02-23 08:25:08.259343659 +0000 UTC m=+5971.263128232" Feb 23 08:25:08 crc kubenswrapper[5118]: I0223 08:25:08.284819 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.211147101 podStartE2EDuration="1m8.284795941s" podCreationTimestamp="2026-02-23 08:24:00 +0000 UTC" firstStartedPulling="2026-02-23 08:24:02.412890202 +0000 UTC m=+5905.416674775" lastFinishedPulling="2026-02-23 08:24:30.486539002 +0000 UTC m=+5933.490323615" observedRunningTime="2026-02-23 08:25:08.280648111 +0000 UTC m=+5971.284432684" watchObservedRunningTime="2026-02-23 08:25:08.284795941 +0000 UTC m=+5971.288580514" Feb 23 08:25:21 crc kubenswrapper[5118]: I0223 08:25:21.723045 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 23 08:25:22 crc kubenswrapper[5118]: I0223 08:25:22.096423 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.185700 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:25:27 crc kubenswrapper[5118]: E0223 08:25:27.186372 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90069789-63ca-43a0-9aea-a49dc5f2233f" containerName="mariadb-account-create-update" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.186389 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="90069789-63ca-43a0-9aea-a49dc5f2233f" containerName="mariadb-account-create-update" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.186593 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="90069789-63ca-43a0-9aea-a49dc5f2233f" containerName="mariadb-account-create-update" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.187579 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.203189 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.246014 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87xf\" (UniqueName: \"kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.246191 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.246358 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.347746 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.347859 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.347919 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b87xf\" (UniqueName: \"kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.348820 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.349192 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.374190 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87xf\" (UniqueName: 
\"kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf\") pod \"dnsmasq-dns-79496f79cc-s5blm\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.556802 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:27 crc kubenswrapper[5118]: I0223 08:25:27.887062 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:25:28 crc kubenswrapper[5118]: I0223 08:25:28.014386 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:28 crc kubenswrapper[5118]: I0223 08:25:28.398889 5118 generic.go:334] "Generic (PLEG): container finished" podID="87add532-23a8-4a35-a83c-10fbe3457f67" containerID="15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7" exitCode=0 Feb 23 08:25:28 crc kubenswrapper[5118]: I0223 08:25:28.398972 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" event={"ID":"87add532-23a8-4a35-a83c-10fbe3457f67","Type":"ContainerDied","Data":"15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7"} Feb 23 08:25:28 crc kubenswrapper[5118]: I0223 08:25:28.399022 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" event={"ID":"87add532-23a8-4a35-a83c-10fbe3457f67","Type":"ContainerStarted","Data":"1d65cce26dec757f52f85f7a943c46c382a6f05790a5a29141f09bb68391e473"} Feb 23 08:25:29 crc kubenswrapper[5118]: I0223 08:25:29.059522 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 08:25:29 crc kubenswrapper[5118]: I0223 08:25:29.411181 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" 
event={"ID":"87add532-23a8-4a35-a83c-10fbe3457f67","Type":"ContainerStarted","Data":"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046"} Feb 23 08:25:29 crc kubenswrapper[5118]: I0223 08:25:29.411395 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:29 crc kubenswrapper[5118]: I0223 08:25:29.433877 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" podStartSLOduration=2.433855445 podStartE2EDuration="2.433855445s" podCreationTimestamp="2026-02-23 08:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:25:29.431301323 +0000 UTC m=+5992.435085896" watchObservedRunningTime="2026-02-23 08:25:29.433855445 +0000 UTC m=+5992.437640018" Feb 23 08:25:29 crc kubenswrapper[5118]: I0223 08:25:29.913968 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="rabbitmq" containerID="cri-o://4366b47e0c1490943c7e9055ae4b446af1780b60bd13644b14fa90c66ad14795" gracePeriod=604799 Feb 23 08:25:30 crc kubenswrapper[5118]: I0223 08:25:30.939632 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="rabbitmq" containerID="cri-o://7828d7983304b13e19e9a0fc884e4d76e6e3526ff33d0440568f9a4a6b308446" gracePeriod=604799 Feb 23 08:25:31 crc kubenswrapper[5118]: I0223 08:25:31.719429 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.254:5672: connect: connection refused" Feb 23 08:25:32 crc kubenswrapper[5118]: I0223 08:25:32.093990 5118 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.255:5672: connect: connection refused" Feb 23 08:25:32 crc kubenswrapper[5118]: I0223 08:25:32.975517 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:25:32 crc kubenswrapper[5118]: I0223 08:25:32.975638 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.500783 5118 generic.go:334] "Generic (PLEG): container finished" podID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerID="4366b47e0c1490943c7e9055ae4b446af1780b60bd13644b14fa90c66ad14795" exitCode=0 Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.501007 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerDied","Data":"4366b47e0c1490943c7e9055ae4b446af1780b60bd13644b14fa90c66ad14795"} Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.501828 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d8dfb84-18ac-44e7-8774-d1e193bf72b3","Type":"ContainerDied","Data":"475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3"} Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.501863 5118 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="475d36bfbc5fd6d5d703119ad617e9ce8332f91f4343d2bc0f82210f811f83b3" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.523333 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.631470 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.631566 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.631852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.631898 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdx4s\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.631935 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins\") pod 
\"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.632009 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.632063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.632156 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.632194 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info\") pod \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\" (UID: \"5d8dfb84-18ac-44e7-8774-d1e193bf72b3\") " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.632685 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.633554 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.633797 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.640996 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info" (OuterVolumeSpecName: "pod-info") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.643337 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s" (OuterVolumeSpecName: "kube-api-access-gdx4s") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "kube-api-access-gdx4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.648060 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.649800 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2" (OuterVolumeSpecName: "persistence") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.659233 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf" (OuterVolumeSpecName: "server-conf") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.725868 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d8dfb84-18ac-44e7-8774-d1e193bf72b3" (UID: "5d8dfb84-18ac-44e7-8774-d1e193bf72b3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735075 5118 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735148 5118 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735161 5118 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735171 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735186 5118 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735309 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") on node \"crc\" " Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735353 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdx4s\" (UniqueName: \"kubernetes.io/projected/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-kube-api-access-gdx4s\") on node \"crc\" DevicePath \"\"" Feb 23 
08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735370 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.735382 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d8dfb84-18ac-44e7-8774-d1e193bf72b3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.763783 5118 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.763956 5118 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2") on node "crc" Feb 23 08:25:36 crc kubenswrapper[5118]: I0223 08:25:36.836791 5118 reconciler_common.go:293] "Volume detached for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.537438 5118 generic.go:334] "Generic (PLEG): container finished" podID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerID="7828d7983304b13e19e9a0fc884e4d76e6e3526ff33d0440568f9a4a6b308446" exitCode=0 Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.537627 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.538714 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerDied","Data":"7828d7983304b13e19e9a0fc884e4d76e6e3526ff33d0440568f9a4a6b308446"} Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.559475 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.638158 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.647703 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"] Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.648049 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="dnsmasq-dns" containerID="cri-o://62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea" gracePeriod=10 Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742277 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742519 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742534 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:37 crc kubenswrapper[5118]: E0223 08:25:37.742844 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="setup-container" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742855 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="setup-container" Feb 23 08:25:37 crc kubenswrapper[5118]: E0223 08:25:37.742872 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742878 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: E0223 08:25:37.742897 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="setup-container" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742905 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="setup-container" Feb 23 08:25:37 crc kubenswrapper[5118]: E0223 08:25:37.742913 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.742918 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.743064 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.743082 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" containerName="rabbitmq" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.743823 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.745958 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.746123 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.746241 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.748253 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-z7xd8" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.748274 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.748461 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757067 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757180 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757200 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757220 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757250 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmf27\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757439 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757480 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.757570 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 
crc kubenswrapper[5118]: I0223 08:25:37.757818 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret\") pod \"883474ec-4fbf-47ab-9d9f-304e53823d98\" (UID: \"883474ec-4fbf-47ab-9d9f-304e53823d98\") " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.758634 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.761954 5118 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.763888 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.764690 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.768703 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info" (OuterVolumeSpecName: "pod-info") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.776479 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27" (OuterVolumeSpecName: "kube-api-access-xmf27") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "kube-api-access-xmf27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.785336 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.799427 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf" (OuterVolumeSpecName: "server-conf") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.800279 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2" (OuterVolumeSpecName: "persistence") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864126 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864207 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864317 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864514 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864544 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864783 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsf97\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-kube-api-access-nsf97\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.864909 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865287 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865339 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865477 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmf27\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-kube-api-access-xmf27\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865509 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") on node \"crc\" " Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865521 5118 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/883474ec-4fbf-47ab-9d9f-304e53823d98-pod-info\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865535 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865547 5118 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/883474ec-4fbf-47ab-9d9f-304e53823d98-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865555 5118 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/883474ec-4fbf-47ab-9d9f-304e53823d98-server-conf\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.865563 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.874345 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "883474ec-4fbf-47ab-9d9f-304e53823d98" (UID: "883474ec-4fbf-47ab-9d9f-304e53823d98"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.895863 5118 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.896039 5118 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2") on node "crc" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966699 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsf97\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-kube-api-access-nsf97\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966760 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966788 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966806 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966832 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966863 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966879 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966927 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.966993 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.967124 5118 reconciler_common.go:293] "Volume detached for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.967144 5118 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/883474ec-4fbf-47ab-9d9f-304e53823d98-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.968480 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.970557 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.971923 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.972185 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.973748 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.973797 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f28412002351dd7faaa1cdc039facc478af48ee316228c5423d737536d43b857/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.974996 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.975501 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc 
kubenswrapper[5118]: I0223 08:25:37.975862 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:37 crc kubenswrapper[5118]: I0223 08:25:37.994339 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsf97\" (UniqueName: \"kubernetes.io/projected/c25bc4bd-e5d5-49b8-9419-28f0e33f394c-kube-api-access-nsf97\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.007478 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbec4e72-b0c9-4168-8bd9-3f319422bbf2\") pod \"rabbitmq-server-0\" (UID: \"c25bc4bd-e5d5-49b8-9419-28f0e33f394c\") " pod="openstack/rabbitmq-server-0" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.087023 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.158007 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.169853 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config\") pod \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.170069 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gjz\" (UniqueName: \"kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz\") pod \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.170591 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc\") pod \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\" (UID: \"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23\") " Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.175865 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz" (OuterVolumeSpecName: "kube-api-access-27gjz") pod "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" (UID: "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23"). InnerVolumeSpecName "kube-api-access-27gjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.216124 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config" (OuterVolumeSpecName: "config") pod "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" (UID: "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.241694 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" (UID: "0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.278165 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.278213 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.278226 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gjz\" (UniqueName: \"kubernetes.io/projected/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23-kube-api-access-27gjz\") on node \"crc\" DevicePath \"\"" Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.419791 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 08:25:38 crc kubenswrapper[5118]: W0223 08:25:38.427142 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25bc4bd_e5d5_49b8_9419_28f0e33f394c.slice/crio-a1bd4e152ab687593d5bb0f4c0e652412a1ba496684c85352535117c680d50b5 WatchSource:0}: Error finding container a1bd4e152ab687593d5bb0f4c0e652412a1ba496684c85352535117c680d50b5: Status 404 returned error can't find the container with id a1bd4e152ab687593d5bb0f4c0e652412a1ba496684c85352535117c680d50b5 Feb 23 08:25:38 crc kubenswrapper[5118]: 
I0223 08:25:38.548030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25bc4bd-e5d5-49b8-9419-28f0e33f394c","Type":"ContainerStarted","Data":"a1bd4e152ab687593d5bb0f4c0e652412a1ba496684c85352535117c680d50b5"}
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.550053 5118 generic.go:334] "Generic (PLEG): container finished" podID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerID="62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea" exitCode=0
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.550140 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" event={"ID":"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23","Type":"ContainerDied","Data":"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"}
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.550239 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl" event={"ID":"0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23","Type":"ContainerDied","Data":"dd097f24343fe7b5bb8494f92df1f817845bd8a35868e537fcccca5f9bf26722"}
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.550235 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb88b7bf5-qrvkl"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.550302 5118 scope.go:117] "RemoveContainer" containerID="62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.552764 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"883474ec-4fbf-47ab-9d9f-304e53823d98","Type":"ContainerDied","Data":"5a456dbd3d4d2713b2ce1a91d47c84047b8a2e206a5e31f1c147ef637403b428"}
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.552793 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.567995 5118 scope.go:117] "RemoveContainer" containerID="82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.594913 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"]
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.598969 5118 scope.go:117] "RemoveContainer" containerID="62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.600208 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bb88b7bf5-qrvkl"]
Feb 23 08:25:38 crc kubenswrapper[5118]: E0223 08:25:38.602290 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea\": container with ID starting with 62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea not found: ID does not exist" containerID="62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.602355 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea"} err="failed to get container status \"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea\": rpc error: code = NotFound desc = could not find container \"62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea\": container with ID starting with 62f5a57713379e24525aa4eca531f451e20de12acac671bfe60f74e7b8165fea not found: ID does not exist"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.602401 5118 scope.go:117] "RemoveContainer" containerID="82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938"
Feb 23 08:25:38 crc kubenswrapper[5118]: E0223 08:25:38.603353 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938\": container with ID starting with 82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938 not found: ID does not exist" containerID="82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.603431 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938"} err="failed to get container status \"82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938\": rpc error: code = NotFound desc = could not find container \"82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938\": container with ID starting with 82e1f7fc1f5de31b45f65a58d607ce839235600f6c62d2fcc25ec723399a4938 not found: ID does not exist"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.603491 5118 scope.go:117] "RemoveContainer" containerID="7828d7983304b13e19e9a0fc884e4d76e6e3526ff33d0440568f9a4a6b308446"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.623269 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.627707 5118 scope.go:117] "RemoveContainer" containerID="0d2ddb0bc434e2bbf38831b1667e0dba12b12c6c007226642771685944a8d637"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.633946 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.660185 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:25:38 crc kubenswrapper[5118]: E0223 08:25:38.660820 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="dnsmasq-dns"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.660855 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="dnsmasq-dns"
Feb 23 08:25:38 crc kubenswrapper[5118]: E0223 08:25:38.660887 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="init"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.660903 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="init"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.661359 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" containerName="dnsmasq-dns"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.663223 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.665271 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.665350 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-jtkmt"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.666631 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.666746 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.667084 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.668817 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801603 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801687 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801744 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c02f2661-073e-48e7-b6c6-527de9c32b91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801786 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801813 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.801904 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c02f2661-073e-48e7-b6c6-527de9c32b91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.802153 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.802213 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.802353 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8jjl\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-kube-api-access-h8jjl\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904402 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904503 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904539 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c02f2661-073e-48e7-b6c6-527de9c32b91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904593 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904628 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904690 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c02f2661-073e-48e7-b6c6-527de9c32b91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904752 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904799 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904876 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8jjl\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-kube-api-access-h8jjl\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.904960 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.906734 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.907419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.907450 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c02f2661-073e-48e7-b6c6-527de9c32b91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.908904 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.908949 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fde6efe325c4fc32923ec989bc4f441a900a9715e86789b84f83171ec6fd762e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.910660 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c02f2661-073e-48e7-b6c6-527de9c32b91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.911227 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c02f2661-073e-48e7-b6c6-527de9c32b91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.912650 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.932312 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8jjl\" (UniqueName: \"kubernetes.io/projected/c02f2661-073e-48e7-b6c6-527de9c32b91-kube-api-access-h8jjl\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.949630 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fd38af74-0b1f-4dda-92b3-99e7626c99c2\") pod \"rabbitmq-cell1-server-0\" (UID: \"c02f2661-073e-48e7-b6c6-527de9c32b91\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:38 crc kubenswrapper[5118]: I0223 08:25:38.992203 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:25:39 crc kubenswrapper[5118]: W0223 08:25:39.288177 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02f2661_073e_48e7_b6c6_527de9c32b91.slice/crio-aa54d8cd881a7af0a7710841b79cb932a86b35af9b98ba524e61f585c08cbd0e WatchSource:0}: Error finding container aa54d8cd881a7af0a7710841b79cb932a86b35af9b98ba524e61f585c08cbd0e: Status 404 returned error can't find the container with id aa54d8cd881a7af0a7710841b79cb932a86b35af9b98ba524e61f585c08cbd0e
Feb 23 08:25:39 crc kubenswrapper[5118]: I0223 08:25:39.293905 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 08:25:39 crc kubenswrapper[5118]: I0223 08:25:39.561711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c02f2661-073e-48e7-b6c6-527de9c32b91","Type":"ContainerStarted","Data":"aa54d8cd881a7af0a7710841b79cb932a86b35af9b98ba524e61f585c08cbd0e"}
Feb 23 08:25:39 crc kubenswrapper[5118]: I0223 08:25:39.712557 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23" path="/var/lib/kubelet/pods/0aa82cab-64dc-43a7-bc66-f8a1a4c1aa23/volumes"
Feb 23 08:25:39 crc kubenswrapper[5118]: I0223 08:25:39.714452 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8dfb84-18ac-44e7-8774-d1e193bf72b3" path="/var/lib/kubelet/pods/5d8dfb84-18ac-44e7-8774-d1e193bf72b3/volumes"
Feb 23 08:25:39 crc kubenswrapper[5118]: I0223 08:25:39.717737 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="883474ec-4fbf-47ab-9d9f-304e53823d98" path="/var/lib/kubelet/pods/883474ec-4fbf-47ab-9d9f-304e53823d98/volumes"
Feb 23 08:25:40 crc kubenswrapper[5118]: I0223 08:25:40.579359 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25bc4bd-e5d5-49b8-9419-28f0e33f394c","Type":"ContainerStarted","Data":"6e195655fd38c21b94419ae2439b7251cdcb342177aaa7fd4f3172aa3a6cadb9"}
Feb 23 08:25:41 crc kubenswrapper[5118]: I0223 08:25:41.589881 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c02f2661-073e-48e7-b6c6-527de9c32b91","Type":"ContainerStarted","Data":"8a724ec206dea7640635f5099ffdbb69cf91fe959596ba0e81029fd65e0fb52d"}
Feb 23 08:26:02 crc kubenswrapper[5118]: I0223 08:26:02.975179 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:26:02 crc kubenswrapper[5118]: I0223 08:26:02.975863 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:26:13 crc kubenswrapper[5118]: I0223 08:26:13.916175 5118 generic.go:334] "Generic (PLEG): container finished" podID="c25bc4bd-e5d5-49b8-9419-28f0e33f394c" containerID="6e195655fd38c21b94419ae2439b7251cdcb342177aaa7fd4f3172aa3a6cadb9" exitCode=0
Feb 23 08:26:13 crc kubenswrapper[5118]: I0223 08:26:13.916914 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25bc4bd-e5d5-49b8-9419-28f0e33f394c","Type":"ContainerDied","Data":"6e195655fd38c21b94419ae2439b7251cdcb342177aaa7fd4f3172aa3a6cadb9"}
Feb 23 08:26:14 crc kubenswrapper[5118]: I0223 08:26:14.932247 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25bc4bd-e5d5-49b8-9419-28f0e33f394c","Type":"ContainerStarted","Data":"067bb4a25e1d6fccd556ae9446f660d38c946a3d015ba85a63491c8e7c8b5648"}
Feb 23 08:26:14 crc kubenswrapper[5118]: I0223 08:26:14.935041 5118 generic.go:334] "Generic (PLEG): container finished" podID="c02f2661-073e-48e7-b6c6-527de9c32b91" containerID="8a724ec206dea7640635f5099ffdbb69cf91fe959596ba0e81029fd65e0fb52d" exitCode=0
Feb 23 08:26:14 crc kubenswrapper[5118]: I0223 08:26:14.933606 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 23 08:26:14 crc kubenswrapper[5118]: I0223 08:26:14.935556 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c02f2661-073e-48e7-b6c6-527de9c32b91","Type":"ContainerDied","Data":"8a724ec206dea7640635f5099ffdbb69cf91fe959596ba0e81029fd65e0fb52d"}
Feb 23 08:26:14 crc kubenswrapper[5118]: I0223 08:26:14.981626 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.981602587 podStartE2EDuration="37.981602587s" podCreationTimestamp="2026-02-23 08:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:26:14.969539456 +0000 UTC m=+6037.973324029" watchObservedRunningTime="2026-02-23 08:26:14.981602587 +0000 UTC m=+6037.985387160"
Feb 23 08:26:15 crc kubenswrapper[5118]: I0223 08:26:15.947264 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c02f2661-073e-48e7-b6c6-527de9c32b91","Type":"ContainerStarted","Data":"f87d25bf32c448560dad98a53b0d7f9acb23229e1344b187cb2c35c8d9909a24"}
Feb 23 08:26:15 crc kubenswrapper[5118]: I0223 08:26:15.948331 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:26:16 crc kubenswrapper[5118]: I0223 08:26:16.003115 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.003083472 podStartE2EDuration="38.003083472s" podCreationTimestamp="2026-02-23 08:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:26:15.985238533 +0000 UTC m=+6038.989023146" watchObservedRunningTime="2026-02-23 08:26:16.003083472 +0000 UTC m=+6039.006868055"
Feb 23 08:26:28 crc kubenswrapper[5118]: I0223 08:26:28.161358 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 08:26:28 crc kubenswrapper[5118]: I0223 08:26:28.995364 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 08:26:32 crc kubenswrapper[5118]: I0223 08:26:32.975699 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:26:32 crc kubenswrapper[5118]: I0223 08:26:32.976043 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:26:32 crc kubenswrapper[5118]: I0223 08:26:32.976134 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 08:26:32 crc kubenswrapper[5118]: I0223 08:26:32.976861 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:26:32 crc kubenswrapper[5118]: I0223 08:26:32.976918 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" gracePeriod=600
Feb 23 08:26:33 crc kubenswrapper[5118]: E0223 08:26:33.140512 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:26:34 crc kubenswrapper[5118]: I0223 08:26:34.126976 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" exitCode=0
Feb 23 08:26:34 crc kubenswrapper[5118]: I0223 08:26:34.127071 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"}
Feb 23 08:26:34 crc kubenswrapper[5118]: I0223 08:26:34.128522 5118 scope.go:117] "RemoveContainer" containerID="1e2b737d523c4548d60f3331b545eb0ad583561f50e921a2ebd442cb227f1582"
Feb 23 08:26:34 crc kubenswrapper[5118]: I0223 08:26:34.129497 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:26:34 crc kubenswrapper[5118]: E0223 08:26:34.130189 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.112835 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.118211 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.125309 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbpwf"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.127646 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.138137 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpchm\" (UniqueName: \"kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm\") pod \"mariadb-client\" (UID: \"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5\") " pod="openstack/mariadb-client"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.238908 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpchm\" (UniqueName: \"kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm\") pod \"mariadb-client\" (UID: \"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5\") " pod="openstack/mariadb-client"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.270963 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpchm\" (UniqueName: \"kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm\") pod \"mariadb-client\" (UID: \"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5\") " pod="openstack/mariadb-client"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.452403 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.777722 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:40 crc kubenswrapper[5118]: W0223 08:26:40.788426 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d522b5_d2d2_4ae8_97c0_0ec54f42dee5.slice/crio-d49fd64ecd94900cfb6c677bb0606f0ee2baa172de15fee493c4f58b978f433c WatchSource:0}: Error finding container d49fd64ecd94900cfb6c677bb0606f0ee2baa172de15fee493c4f58b978f433c: Status 404 returned error can't find the container with id d49fd64ecd94900cfb6c677bb0606f0ee2baa172de15fee493c4f58b978f433c
Feb 23 08:26:40 crc kubenswrapper[5118]: I0223 08:26:40.791999 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:26:41 crc kubenswrapper[5118]: I0223 08:26:41.207387 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5","Type":"ContainerStarted","Data":"d49fd64ecd94900cfb6c677bb0606f0ee2baa172de15fee493c4f58b978f433c"}
Feb 23 08:26:42 crc kubenswrapper[5118]: I0223 08:26:42.216331 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5","Type":"ContainerStarted","Data":"0dba79c8388679e9176987279e4f0ce6a1d286280dca11f3fe1e7753f98373d1"}
Feb 23 08:26:42 crc kubenswrapper[5118]: I0223 08:26:42.240346 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.3614131409999999 podStartE2EDuration="2.240319672s" podCreationTimestamp="2026-02-23 08:26:40 +0000 UTC" firstStartedPulling="2026-02-23 08:26:40.791789581 +0000 UTC m=+6063.795574154" lastFinishedPulling="2026-02-23 08:26:41.670696092 +0000 UTC m=+6064.674480685" observedRunningTime="2026-02-23 08:26:42.236345957 +0000 UTC m=+6065.240130550" watchObservedRunningTime="2026-02-23 08:26:42.240319672 +0000 UTC m=+6065.244104255"
Feb 23 08:26:47 crc kubenswrapper[5118]: I0223 08:26:47.705848 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:26:47 crc kubenswrapper[5118]: E0223 08:26:47.707207 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:26:50 crc kubenswrapper[5118]: E0223 08:26:50.705871 5118 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:34372->38.102.83.46:40611: write tcp 38.102.83.46:34372->38.102.83.46:40611: write: connection reset by peer
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.193315 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.194217 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" containerName="mariadb-client" containerID="cri-o://0dba79c8388679e9176987279e4f0ce6a1d286280dca11f3fe1e7753f98373d1" gracePeriod=30
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.396959 5118 generic.go:334] "Generic (PLEG): container finished" podID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" containerID="0dba79c8388679e9176987279e4f0ce6a1d286280dca11f3fe1e7753f98373d1" exitCode=143
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.397019 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5","Type":"ContainerDied","Data":"0dba79c8388679e9176987279e4f0ce6a1d286280dca11f3fe1e7753f98373d1"}
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.809180 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.943272 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpchm\" (UniqueName: \"kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm\") pod \"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5\" (UID: \"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5\") "
Feb 23 08:26:56 crc kubenswrapper[5118]: I0223 08:26:56.949307 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm" (OuterVolumeSpecName: "kube-api-access-mpchm") pod "36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" (UID: "36d522b5-d2d2-4ae8-97c0-0ec54f42dee5"). InnerVolumeSpecName "kube-api-access-mpchm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.046499 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpchm\" (UniqueName: \"kubernetes.io/projected/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5-kube-api-access-mpchm\") on node \"crc\" DevicePath \"\""
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.410383 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"36d522b5-d2d2-4ae8-97c0-0ec54f42dee5","Type":"ContainerDied","Data":"d49fd64ecd94900cfb6c677bb0606f0ee2baa172de15fee493c4f58b978f433c"}
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.410456 5118 scope.go:117] "RemoveContainer" containerID="0dba79c8388679e9176987279e4f0ce6a1d286280dca11f3fe1e7753f98373d1"
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.410506 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.471251 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.482002 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Feb 23 08:26:57 crc kubenswrapper[5118]: I0223 08:26:57.711076 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" path="/var/lib/kubelet/pods/36d522b5-d2d2-4ae8-97c0-0ec54f42dee5/volumes"
Feb 23 08:26:59 crc kubenswrapper[5118]: I0223 08:26:59.697707 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:26:59 crc kubenswrapper[5118]: E0223 08:26:59.698998 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:27:12 crc kubenswrapper[5118]: I0223 08:27:12.698060 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:27:12 crc kubenswrapper[5118]: E0223 08:27:12.699197 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:27:24 crc kubenswrapper[5118]: I0223 08:27:24.697531 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:27:24 crc kubenswrapper[5118]: E0223 08:27:24.698698 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:27:35 crc kubenswrapper[5118]: I0223 08:27:35.697755 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:27:35 crc kubenswrapper[5118]: E0223 08:27:35.698593 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:27:46 crc kubenswrapper[5118]: I0223 08:27:46.699254 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:27:46 crc kubenswrapper[5118]: E0223 08:27:46.700391 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:27:51 crc kubenswrapper[5118]: I0223 08:27:51.256980 5118 scope.go:117] "RemoveContainer" containerID="492bcd6dba0ae02876e4f6ecffa5881479b703c9c74c9a7ce67d18e8905a6428" Feb 23 08:27:59 crc kubenswrapper[5118]: I0223 08:27:59.698042 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:27:59 crc kubenswrapper[5118]: E0223 08:27:59.699688 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:28:12 crc kubenswrapper[5118]: I0223 08:28:12.698620 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:28:12 crc kubenswrapper[5118]: 
E0223 08:28:12.699552 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:28:23 crc kubenswrapper[5118]: I0223 08:28:23.698045 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:28:23 crc kubenswrapper[5118]: E0223 08:28:23.698627 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:28:36 crc kubenswrapper[5118]: I0223 08:28:36.698148 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:28:36 crc kubenswrapper[5118]: E0223 08:28:36.699546 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:28:47 crc kubenswrapper[5118]: I0223 08:28:47.705609 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:28:47 crc 
kubenswrapper[5118]: E0223 08:28:47.706459 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.462581 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:28:51 crc kubenswrapper[5118]: E0223 08:28:51.463305 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" containerName="mariadb-client" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.463317 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" containerName="mariadb-client" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.463450 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d522b5-d2d2-4ae8-97c0-0ec54f42dee5" containerName="mariadb-client" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.464457 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.488347 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.633399 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klnh9\" (UniqueName: \"kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.633552 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.633593 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.735240 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klnh9\" (UniqueName: \"kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.735341 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.735371 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.736074 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.736262 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.755273 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klnh9\" (UniqueName: \"kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9\") pod \"redhat-marketplace-z5fhx\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:51 crc kubenswrapper[5118]: I0223 08:28:51.801531 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:28:52 crc kubenswrapper[5118]: I0223 08:28:52.093283 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:28:52 crc kubenswrapper[5118]: I0223 08:28:52.539308 5118 generic.go:334] "Generic (PLEG): container finished" podID="4dd1848c-5073-470c-81cf-4818073b1c21" containerID="358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7" exitCode=0 Feb 23 08:28:52 crc kubenswrapper[5118]: I0223 08:28:52.539364 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerDied","Data":"358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7"} Feb 23 08:28:52 crc kubenswrapper[5118]: I0223 08:28:52.539596 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerStarted","Data":"3b69c8e62a3db1ed144b2df2051f2dcc9b220a645fee7d7c0d9d7c1eaecf8fa0"} Feb 23 08:28:53 crc kubenswrapper[5118]: I0223 08:28:53.549281 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerStarted","Data":"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67"} Feb 23 08:28:54 crc kubenswrapper[5118]: I0223 08:28:54.563217 5118 generic.go:334] "Generic (PLEG): container finished" podID="4dd1848c-5073-470c-81cf-4818073b1c21" containerID="8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67" exitCode=0 Feb 23 08:28:54 crc kubenswrapper[5118]: I0223 08:28:54.563347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" 
event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerDied","Data":"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67"} Feb 23 08:28:54 crc kubenswrapper[5118]: I0223 08:28:54.564532 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerStarted","Data":"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace"} Feb 23 08:28:54 crc kubenswrapper[5118]: I0223 08:28:54.710820 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z5fhx" podStartSLOduration=2.291459262 podStartE2EDuration="3.71079675s" podCreationTimestamp="2026-02-23 08:28:51 +0000 UTC" firstStartedPulling="2026-02-23 08:28:52.541686602 +0000 UTC m=+6195.545471185" lastFinishedPulling="2026-02-23 08:28:53.96102406 +0000 UTC m=+6196.964808673" observedRunningTime="2026-02-23 08:28:54.702305196 +0000 UTC m=+6197.706089809" watchObservedRunningTime="2026-02-23 08:28:54.71079675 +0000 UTC m=+6197.714581323" Feb 23 08:29:00 crc kubenswrapper[5118]: I0223 08:29:00.697829 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:29:00 crc kubenswrapper[5118]: E0223 08:29:00.698635 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:29:01 crc kubenswrapper[5118]: I0223 08:29:01.802721 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:01 crc 
kubenswrapper[5118]: I0223 08:29:01.803291 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:01 crc kubenswrapper[5118]: I0223 08:29:01.855816 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:02 crc kubenswrapper[5118]: I0223 08:29:02.732015 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:02 crc kubenswrapper[5118]: I0223 08:29:02.833942 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:29:04 crc kubenswrapper[5118]: I0223 08:29:04.676914 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z5fhx" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="registry-server" containerID="cri-o://f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace" gracePeriod=2 Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.175389 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.297732 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content\") pod \"4dd1848c-5073-470c-81cf-4818073b1c21\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.297836 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klnh9\" (UniqueName: \"kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9\") pod \"4dd1848c-5073-470c-81cf-4818073b1c21\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.297920 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities\") pod \"4dd1848c-5073-470c-81cf-4818073b1c21\" (UID: \"4dd1848c-5073-470c-81cf-4818073b1c21\") " Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.299016 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities" (OuterVolumeSpecName: "utilities") pod "4dd1848c-5073-470c-81cf-4818073b1c21" (UID: "4dd1848c-5073-470c-81cf-4818073b1c21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.305255 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9" (OuterVolumeSpecName: "kube-api-access-klnh9") pod "4dd1848c-5073-470c-81cf-4818073b1c21" (UID: "4dd1848c-5073-470c-81cf-4818073b1c21"). InnerVolumeSpecName "kube-api-access-klnh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.339293 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dd1848c-5073-470c-81cf-4818073b1c21" (UID: "4dd1848c-5073-470c-81cf-4818073b1c21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.399794 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.399858 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klnh9\" (UniqueName: \"kubernetes.io/projected/4dd1848c-5073-470c-81cf-4818073b1c21-kube-api-access-klnh9\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.399881 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd1848c-5073-470c-81cf-4818073b1c21-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.687418 5118 generic.go:334] "Generic (PLEG): container finished" podID="4dd1848c-5073-470c-81cf-4818073b1c21" containerID="f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace" exitCode=0 Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.687514 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerDied","Data":"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace"} Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.687637 5118 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-z5fhx" event={"ID":"4dd1848c-5073-470c-81cf-4818073b1c21","Type":"ContainerDied","Data":"3b69c8e62a3db1ed144b2df2051f2dcc9b220a645fee7d7c0d9d7c1eaecf8fa0"} Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.687675 5118 scope.go:117] "RemoveContainer" containerID="f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.687530 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5fhx" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.720962 5118 scope.go:117] "RemoveContainer" containerID="8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.755670 5118 scope.go:117] "RemoveContainer" containerID="358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.755895 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.767992 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5fhx"] Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.788806 5118 scope.go:117] "RemoveContainer" containerID="f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace" Feb 23 08:29:05 crc kubenswrapper[5118]: E0223 08:29:05.789326 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace\": container with ID starting with f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace not found: ID does not exist" containerID="f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.789395 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace"} err="failed to get container status \"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace\": rpc error: code = NotFound desc = could not find container \"f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace\": container with ID starting with f5720f763550d2afb4d9ba9ffd82cbb074c6e85b92e2a41a4608880a991edace not found: ID does not exist" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.789443 5118 scope.go:117] "RemoveContainer" containerID="8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67" Feb 23 08:29:05 crc kubenswrapper[5118]: E0223 08:29:05.789939 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67\": container with ID starting with 8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67 not found: ID does not exist" containerID="8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.789977 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67"} err="failed to get container status \"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67\": rpc error: code = NotFound desc = could not find container \"8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67\": container with ID starting with 8f74e65f6b52a4355b7916f014d26b606573a35d8e3cd52f25e095e419662d67 not found: ID does not exist" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.789998 5118 scope.go:117] "RemoveContainer" containerID="358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7" Feb 23 08:29:05 crc kubenswrapper[5118]: E0223 
08:29:05.790696 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7\": container with ID starting with 358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7 not found: ID does not exist" containerID="358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7" Feb 23 08:29:05 crc kubenswrapper[5118]: I0223 08:29:05.790719 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7"} err="failed to get container status \"358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7\": rpc error: code = NotFound desc = could not find container \"358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7\": container with ID starting with 358b856023c9d0d7288af0b0f3b9c8919abbcc87156b64a4e4320c7e9ac6fef7 not found: ID does not exist" Feb 23 08:29:07 crc kubenswrapper[5118]: I0223 08:29:07.715759 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" path="/var/lib/kubelet/pods/4dd1848c-5073-470c-81cf-4818073b1c21/volumes" Feb 23 08:29:14 crc kubenswrapper[5118]: I0223 08:29:14.698306 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:29:14 crc kubenswrapper[5118]: E0223 08:29:14.699466 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:29:28 crc kubenswrapper[5118]: I0223 08:29:28.697686 
5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:29:28 crc kubenswrapper[5118]: E0223 08:29:28.698826 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:29:40 crc kubenswrapper[5118]: I0223 08:29:40.697590 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:29:40 crc kubenswrapper[5118]: E0223 08:29:40.698426 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:29:51 crc kubenswrapper[5118]: I0223 08:29:51.698055 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:29:51 crc kubenswrapper[5118]: E0223 08:29:51.699192 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 
08:30:00.172563 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx"] Feb 23 08:30:00 crc kubenswrapper[5118]: E0223 08:30:00.173897 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="extract-utilities" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.173923 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="extract-utilities" Feb 23 08:30:00 crc kubenswrapper[5118]: E0223 08:30:00.173945 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="extract-content" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.173958 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="extract-content" Feb 23 08:30:00 crc kubenswrapper[5118]: E0223 08:30:00.173977 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.173991 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.174292 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd1848c-5073-470c-81cf-4818073b1c21" containerName="registry-server" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.175335 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.180931 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.181007 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.197623 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx"] Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.318127 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.318251 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.318494 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcrs4\" (UniqueName: \"kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.421129 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.421223 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.421288 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcrs4\" (UniqueName: \"kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.422516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.433893 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.452757 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcrs4\" (UniqueName: \"kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4\") pod \"collect-profiles-29530590-px8wx\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:00 crc kubenswrapper[5118]: I0223 08:30:00.513810 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:01 crc kubenswrapper[5118]: I0223 08:30:01.007526 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx"] Feb 23 08:30:01 crc kubenswrapper[5118]: I0223 08:30:01.238451 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" event={"ID":"b96aadd9-1e52-4dec-96cf-936f0d92aab1","Type":"ContainerStarted","Data":"519906fd70d32fc8df3ea8cfb0359c4725b533a5689c70686c36c318a7fa512f"} Feb 23 08:30:01 crc kubenswrapper[5118]: I0223 08:30:01.239039 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" event={"ID":"b96aadd9-1e52-4dec-96cf-936f0d92aab1","Type":"ContainerStarted","Data":"7210adb0b1565be19c0ddbf1c2e6cfa6064bc19e9c0c721829f2f97d10e19511"} Feb 23 08:30:01 crc kubenswrapper[5118]: I0223 08:30:01.267522 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" 
podStartSLOduration=1.267499428 podStartE2EDuration="1.267499428s" podCreationTimestamp="2026-02-23 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:30:01.259568206 +0000 UTC m=+6264.263352789" watchObservedRunningTime="2026-02-23 08:30:01.267499428 +0000 UTC m=+6264.271284001" Feb 23 08:30:02 crc kubenswrapper[5118]: I0223 08:30:02.249221 5118 generic.go:334] "Generic (PLEG): container finished" podID="b96aadd9-1e52-4dec-96cf-936f0d92aab1" containerID="519906fd70d32fc8df3ea8cfb0359c4725b533a5689c70686c36c318a7fa512f" exitCode=0 Feb 23 08:30:02 crc kubenswrapper[5118]: I0223 08:30:02.249325 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" event={"ID":"b96aadd9-1e52-4dec-96cf-936f0d92aab1","Type":"ContainerDied","Data":"519906fd70d32fc8df3ea8cfb0359c4725b533a5689c70686c36c318a7fa512f"} Feb 23 08:30:02 crc kubenswrapper[5118]: I0223 08:30:02.697741 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:30:02 crc kubenswrapper[5118]: E0223 08:30:02.698085 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.569615 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.676567 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume\") pod \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.676665 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume\") pod \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.676708 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcrs4\" (UniqueName: \"kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4\") pod \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\" (UID: \"b96aadd9-1e52-4dec-96cf-936f0d92aab1\") " Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.678271 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume" (OuterVolumeSpecName: "config-volume") pod "b96aadd9-1e52-4dec-96cf-936f0d92aab1" (UID: "b96aadd9-1e52-4dec-96cf-936f0d92aab1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.685770 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b96aadd9-1e52-4dec-96cf-936f0d92aab1" (UID: "b96aadd9-1e52-4dec-96cf-936f0d92aab1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.686372 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4" (OuterVolumeSpecName: "kube-api-access-vcrs4") pod "b96aadd9-1e52-4dec-96cf-936f0d92aab1" (UID: "b96aadd9-1e52-4dec-96cf-936f0d92aab1"). InnerVolumeSpecName "kube-api-access-vcrs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.779135 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b96aadd9-1e52-4dec-96cf-936f0d92aab1-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.779182 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b96aadd9-1e52-4dec-96cf-936f0d92aab1-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:03 crc kubenswrapper[5118]: I0223 08:30:03.779196 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcrs4\" (UniqueName: \"kubernetes.io/projected/b96aadd9-1e52-4dec-96cf-936f0d92aab1-kube-api-access-vcrs4\") on node \"crc\" DevicePath \"\"" Feb 23 08:30:04 crc kubenswrapper[5118]: I0223 08:30:04.274916 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" event={"ID":"b96aadd9-1e52-4dec-96cf-936f0d92aab1","Type":"ContainerDied","Data":"7210adb0b1565be19c0ddbf1c2e6cfa6064bc19e9c0c721829f2f97d10e19511"} Feb 23 08:30:04 crc kubenswrapper[5118]: I0223 08:30:04.274983 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7210adb0b1565be19c0ddbf1c2e6cfa6064bc19e9c0c721829f2f97d10e19511" Feb 23 08:30:04 crc kubenswrapper[5118]: I0223 08:30:04.275018 5118 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx" Feb 23 08:30:04 crc kubenswrapper[5118]: I0223 08:30:04.377241 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792"] Feb 23 08:30:04 crc kubenswrapper[5118]: I0223 08:30:04.385421 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530545-b4792"] Feb 23 08:30:05 crc kubenswrapper[5118]: I0223 08:30:05.710938 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033ed359-95df-44ef-bbfc-c9eee59cf768" path="/var/lib/kubelet/pods/033ed359-95df-44ef-bbfc-c9eee59cf768/volumes" Feb 23 08:30:15 crc kubenswrapper[5118]: I0223 08:30:15.698291 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:30:15 crc kubenswrapper[5118]: E0223 08:30:15.699664 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:30:27 crc kubenswrapper[5118]: I0223 08:30:27.704288 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:30:27 crc kubenswrapper[5118]: E0223 08:30:27.705463 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:30:40 crc kubenswrapper[5118]: I0223 08:30:40.697840 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:30:40 crc kubenswrapper[5118]: E0223 08:30:40.699291 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:30:51 crc kubenswrapper[5118]: I0223 08:30:51.389585 5118 scope.go:117] "RemoveContainer" containerID="c924d9579c69ecb1af079b8248649efbb7e99c1e614b745c9fc0d5afc60daa31" Feb 23 08:30:51 crc kubenswrapper[5118]: I0223 08:30:51.432621 5118 scope.go:117] "RemoveContainer" containerID="e615da9c9bdf402865129f0b334b1b9da733a273a2c5c8c73dacb4933d99735b" Feb 23 08:30:52 crc kubenswrapper[5118]: I0223 08:30:52.699118 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:30:52 crc kubenswrapper[5118]: E0223 08:30:52.701604 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:31:03 crc kubenswrapper[5118]: I0223 08:31:03.698435 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 
23 08:31:03 crc kubenswrapper[5118]: E0223 08:31:03.699631 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:31:14 crc kubenswrapper[5118]: I0223 08:31:14.698408 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:31:14 crc kubenswrapper[5118]: E0223 08:31:14.699438 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:31:29 crc kubenswrapper[5118]: I0223 08:31:29.698158 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:31:29 crc kubenswrapper[5118]: E0223 08:31:29.699183 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:31:44 crc kubenswrapper[5118]: I0223 08:31:44.697317 5118 scope.go:117] "RemoveContainer" 
containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e" Feb 23 08:31:45 crc kubenswrapper[5118]: I0223 08:31:45.301029 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f"} Feb 23 08:31:51 crc kubenswrapper[5118]: I0223 08:31:51.551661 5118 scope.go:117] "RemoveContainer" containerID="4366b47e0c1490943c7e9055ae4b446af1780b60bd13644b14fa90c66ad14795" Feb 23 08:31:51 crc kubenswrapper[5118]: I0223 08:31:51.586591 5118 scope.go:117] "RemoveContainer" containerID="6a892b34811f623216b756a67d0349d4413e5929bacc85ee145a7e573403d069" Feb 23 08:34:02 crc kubenswrapper[5118]: I0223 08:34:02.975147 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:34:02 crc kubenswrapper[5118]: I0223 08:34:02.976139 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.216738 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"] Feb 23 08:34:24 crc kubenswrapper[5118]: E0223 08:34:24.223663 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96aadd9-1e52-4dec-96cf-936f0d92aab1" containerName="collect-profiles" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.224902 5118 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b96aadd9-1e52-4dec-96cf-936f0d92aab1" containerName="collect-profiles" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.225231 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96aadd9-1e52-4dec-96cf-936f0d92aab1" containerName="collect-profiles" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.227505 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.236934 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"] Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.380575 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.380640 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.380839 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5bk\" (UniqueName: \"kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.482007 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.482064 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.482166 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5bk\" (UniqueName: \"kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.483033 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.483337 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.511535 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nx5bk\" (UniqueName: \"kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk\") pod \"certified-operators-vzdkt\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") " pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:24 crc kubenswrapper[5118]: I0223 08:34:24.570027 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzdkt" Feb 23 08:34:25 crc kubenswrapper[5118]: I0223 08:34:25.121083 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"] Feb 23 08:34:25 crc kubenswrapper[5118]: I0223 08:34:25.911284 5118 generic.go:334] "Generic (PLEG): container finished" podID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerID="a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013" exitCode=0 Feb 23 08:34:25 crc kubenswrapper[5118]: I0223 08:34:25.911397 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerDied","Data":"a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013"} Feb 23 08:34:25 crc kubenswrapper[5118]: I0223 08:34:25.911726 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerStarted","Data":"a3b094707f637920b76f2e8fe5c775005c91118a21de117a0bfbb4bc0c95d5af"} Feb 23 08:34:25 crc kubenswrapper[5118]: I0223 08:34:25.916259 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:34:26 crc kubenswrapper[5118]: I0223 08:34:26.940320 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" 
event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerStarted","Data":"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"} Feb 23 08:34:27 crc kubenswrapper[5118]: I0223 08:34:27.957785 5118 generic.go:334] "Generic (PLEG): container finished" podID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerID="24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710" exitCode=0 Feb 23 08:34:27 crc kubenswrapper[5118]: I0223 08:34:27.957897 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerDied","Data":"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"} Feb 23 08:34:28 crc kubenswrapper[5118]: I0223 08:34:28.973982 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerStarted","Data":"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"} Feb 23 08:34:29 crc kubenswrapper[5118]: I0223 08:34:29.007198 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzdkt" podStartSLOduration=2.549095941 podStartE2EDuration="5.007161999s" podCreationTimestamp="2026-02-23 08:34:24 +0000 UTC" firstStartedPulling="2026-02-23 08:34:25.915492059 +0000 UTC m=+6528.919276662" lastFinishedPulling="2026-02-23 08:34:28.373558127 +0000 UTC m=+6531.377342720" observedRunningTime="2026-02-23 08:34:29.000262513 +0000 UTC m=+6532.004047176" watchObservedRunningTime="2026-02-23 08:34:29.007161999 +0000 UTC m=+6532.010946612" Feb 23 08:34:32 crc kubenswrapper[5118]: I0223 08:34:32.975205 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 23 08:34:32 crc kubenswrapper[5118]: I0223 08:34:32.976343 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.034969 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-98zmk"] Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.037451 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98zmk" Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.050749 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncq5g\" (UniqueName: \"kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk" Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.050844 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk" Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.051008 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " 
pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.053425 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98zmk"]
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.155272 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.155420 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncq5g\" (UniqueName: \"kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.155471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.156183 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.156516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.183977 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncq5g\" (UniqueName: \"kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g\") pod \"community-operators-98zmk\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") " pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.419189 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:33 crc kubenswrapper[5118]: I0223 08:34:33.768613 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-98zmk"]
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.023471 5118 generic.go:334] "Generic (PLEG): container finished" podID="88b1e241-a92f-4306-a094-996a88eb9d07" containerID="d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b" exitCode=0
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.023582 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerDied","Data":"d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b"}
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.023827 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerStarted","Data":"dfebf33638291a52db8d113467d7b4de4532955fc5dec922e68f6567d8cafaa7"}
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.571357 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.571445 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:34 crc kubenswrapper[5118]: I0223 08:34:34.665461 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:35 crc kubenswrapper[5118]: I0223 08:34:35.034558 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerStarted","Data":"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"}
Feb 23 08:34:35 crc kubenswrapper[5118]: I0223 08:34:35.113220 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:36 crc kubenswrapper[5118]: I0223 08:34:36.050337 5118 generic.go:334] "Generic (PLEG): container finished" podID="88b1e241-a92f-4306-a094-996a88eb9d07" containerID="f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a" exitCode=0
Feb 23 08:34:36 crc kubenswrapper[5118]: I0223 08:34:36.050442 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerDied","Data":"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"}
Feb 23 08:34:36 crc kubenswrapper[5118]: I0223 08:34:36.991785 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"]
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.067176 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerStarted","Data":"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"}
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.069565 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzdkt" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="registry-server" containerID="cri-o://4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95" gracePeriod=2
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.099868 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-98zmk" podStartSLOduration=1.6547553069999998 podStartE2EDuration="4.099834933s" podCreationTimestamp="2026-02-23 08:34:33 +0000 UTC" firstStartedPulling="2026-02-23 08:34:34.025442837 +0000 UTC m=+6537.029227430" lastFinishedPulling="2026-02-23 08:34:36.470522473 +0000 UTC m=+6539.474307056" observedRunningTime="2026-02-23 08:34:37.096140823 +0000 UTC m=+6540.099925466" watchObservedRunningTime="2026-02-23 08:34:37.099834933 +0000 UTC m=+6540.103619546"
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.628870 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.643590 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx5bk\" (UniqueName: \"kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk\") pod \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") "
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.643654 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content\") pod \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") "
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.643721 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities\") pod \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\" (UID: \"e9d7c30f-434f-4bf3-9a96-78057acb88d7\") "
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.644771 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities" (OuterVolumeSpecName: "utilities") pod "e9d7c30f-434f-4bf3-9a96-78057acb88d7" (UID: "e9d7c30f-434f-4bf3-9a96-78057acb88d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.656536 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk" (OuterVolumeSpecName: "kube-api-access-nx5bk") pod "e9d7c30f-434f-4bf3-9a96-78057acb88d7" (UID: "e9d7c30f-434f-4bf3-9a96-78057acb88d7"). InnerVolumeSpecName "kube-api-access-nx5bk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.707263 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9d7c30f-434f-4bf3-9a96-78057acb88d7" (UID: "e9d7c30f-434f-4bf3-9a96-78057acb88d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.745042 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx5bk\" (UniqueName: \"kubernetes.io/projected/e9d7c30f-434f-4bf3-9a96-78057acb88d7-kube-api-access-nx5bk\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.745070 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:37 crc kubenswrapper[5118]: I0223 08:34:37.745080 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9d7c30f-434f-4bf3-9a96-78057acb88d7-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.077465 5118 generic.go:334] "Generic (PLEG): container finished" podID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerID="4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95" exitCode=0
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.077596 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzdkt"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.077590 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerDied","Data":"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"}
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.077710 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzdkt" event={"ID":"e9d7c30f-434f-4bf3-9a96-78057acb88d7","Type":"ContainerDied","Data":"a3b094707f637920b76f2e8fe5c775005c91118a21de117a0bfbb4bc0c95d5af"}
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.077746 5118 scope.go:117] "RemoveContainer" containerID="4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.107601 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"]
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.115522 5118 scope.go:117] "RemoveContainer" containerID="24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.117302 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzdkt"]
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.145223 5118 scope.go:117] "RemoveContainer" containerID="a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.197974 5118 scope.go:117] "RemoveContainer" containerID="4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"
Feb 23 08:34:38 crc kubenswrapper[5118]: E0223 08:34:38.200943 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95\": container with ID starting with 4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95 not found: ID does not exist" containerID="4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.200985 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95"} err="failed to get container status \"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95\": rpc error: code = NotFound desc = could not find container \"4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95\": container with ID starting with 4c672f11b5872fc3f03c2399764192f378aa8157cd6bcf592cfbc24213204a95 not found: ID does not exist"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.201010 5118 scope.go:117] "RemoveContainer" containerID="24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"
Feb 23 08:34:38 crc kubenswrapper[5118]: E0223 08:34:38.201839 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710\": container with ID starting with 24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710 not found: ID does not exist" containerID="24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.201923 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710"} err="failed to get container status \"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710\": rpc error: code = NotFound desc = could not find container \"24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710\": container with ID starting with 24ccb3135f8133ca2123c0bd8f70a3e37ff3f23887a0faa86ae80e8bd7e97710 not found: ID does not exist"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.201960 5118 scope.go:117] "RemoveContainer" containerID="a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013"
Feb 23 08:34:38 crc kubenswrapper[5118]: E0223 08:34:38.203374 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013\": container with ID starting with a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013 not found: ID does not exist" containerID="a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013"
Feb 23 08:34:38 crc kubenswrapper[5118]: I0223 08:34:38.203411 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013"} err="failed to get container status \"a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013\": rpc error: code = NotFound desc = could not find container \"a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013\": container with ID starting with a507c3d1c916f8a3d0a2a7314f76e9b2504df4d1bfd25bf66730b95a16b8e013 not found: ID does not exist"
Feb 23 08:34:39 crc kubenswrapper[5118]: I0223 08:34:39.708738 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" path="/var/lib/kubelet/pods/e9d7c30f-434f-4bf3-9a96-78057acb88d7/volumes"
Feb 23 08:34:43 crc kubenswrapper[5118]: I0223 08:34:43.419249 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:43 crc kubenswrapper[5118]: I0223 08:34:43.419745 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:43 crc kubenswrapper[5118]: I0223 08:34:43.493520 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:44 crc kubenswrapper[5118]: I0223 08:34:44.240617 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:44 crc kubenswrapper[5118]: I0223 08:34:44.320786 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98zmk"]
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.178143 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-98zmk" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="registry-server" containerID="cri-o://3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb" gracePeriod=2
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.649677 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.815461 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities\") pod \"88b1e241-a92f-4306-a094-996a88eb9d07\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") "
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.815720 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content\") pod \"88b1e241-a92f-4306-a094-996a88eb9d07\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") "
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.815823 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncq5g\" (UniqueName: \"kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g\") pod \"88b1e241-a92f-4306-a094-996a88eb9d07\" (UID: \"88b1e241-a92f-4306-a094-996a88eb9d07\") "
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.818509 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities" (OuterVolumeSpecName: "utilities") pod "88b1e241-a92f-4306-a094-996a88eb9d07" (UID: "88b1e241-a92f-4306-a094-996a88eb9d07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.829058 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g" (OuterVolumeSpecName: "kube-api-access-ncq5g") pod "88b1e241-a92f-4306-a094-996a88eb9d07" (UID: "88b1e241-a92f-4306-a094-996a88eb9d07"). InnerVolumeSpecName "kube-api-access-ncq5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.918682 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.918744 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncq5g\" (UniqueName: \"kubernetes.io/projected/88b1e241-a92f-4306-a094-996a88eb9d07-kube-api-access-ncq5g\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:46 crc kubenswrapper[5118]: I0223 08:34:46.922695 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88b1e241-a92f-4306-a094-996a88eb9d07" (UID: "88b1e241-a92f-4306-a094-996a88eb9d07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.022757 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88b1e241-a92f-4306-a094-996a88eb9d07-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.191598 5118 generic.go:334] "Generic (PLEG): container finished" podID="88b1e241-a92f-4306-a094-996a88eb9d07" containerID="3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb" exitCode=0
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.191675 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerDied","Data":"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"}
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.191728 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-98zmk" event={"ID":"88b1e241-a92f-4306-a094-996a88eb9d07","Type":"ContainerDied","Data":"dfebf33638291a52db8d113467d7b4de4532955fc5dec922e68f6567d8cafaa7"}
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.191758 5118 scope.go:117] "RemoveContainer" containerID="3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.191980 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-98zmk"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.245752 5118 scope.go:117] "RemoveContainer" containerID="f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.249766 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-98zmk"]
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.264361 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-98zmk"]
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.273144 5118 scope.go:117] "RemoveContainer" containerID="d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.320275 5118 scope.go:117] "RemoveContainer" containerID="3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"
Feb 23 08:34:47 crc kubenswrapper[5118]: E0223 08:34:47.322785 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb\": container with ID starting with 3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb not found: ID does not exist" containerID="3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.322847 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb"} err="failed to get container status \"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb\": rpc error: code = NotFound desc = could not find container \"3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb\": container with ID starting with 3a54f3e5e5ece637652d0eaee98272ef4b33004e023f8948ec43f6a79d3595bb not found: ID does not exist"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.322886 5118 scope.go:117] "RemoveContainer" containerID="f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"
Feb 23 08:34:47 crc kubenswrapper[5118]: E0223 08:34:47.323594 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a\": container with ID starting with f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a not found: ID does not exist" containerID="f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.323644 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a"} err="failed to get container status \"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a\": rpc error: code = NotFound desc = could not find container \"f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a\": container with ID starting with f15166521e265b1f813a44b0e3e45560ce9413e265221d2e8c82388ee2a2af6a not found: ID does not exist"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.323674 5118 scope.go:117] "RemoveContainer" containerID="d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b"
Feb 23 08:34:47 crc kubenswrapper[5118]: E0223 08:34:47.324682 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b\": container with ID starting with d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b not found: ID does not exist" containerID="d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.324724 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b"} err="failed to get container status \"d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b\": rpc error: code = NotFound desc = could not find container \"d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b\": container with ID starting with d669ab3dd8ba3c3c0d5251d9863c5e660c21ead161c02b2e5a70af70ea0dc85b not found: ID does not exist"
Feb 23 08:34:47 crc kubenswrapper[5118]: I0223 08:34:47.722786 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" path="/var/lib/kubelet/pods/88b1e241-a92f-4306-a094-996a88eb9d07/volumes"
Feb 23 08:35:02 crc kubenswrapper[5118]: I0223 08:35:02.975792 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:35:02 crc kubenswrapper[5118]: I0223 08:35:02.976904 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:35:02 crc kubenswrapper[5118]: I0223 08:35:02.976974 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 08:35:02 crc kubenswrapper[5118]: I0223 08:35:02.977977 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:35:02 crc kubenswrapper[5118]: I0223 08:35:02.978067 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f" gracePeriod=600
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.260633 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"]
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261122 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="extract-utilities"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261137 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="extract-utilities"
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261158 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261166 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261181 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="extract-utilities"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261189 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="extract-utilities"
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261204 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="extract-content"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261211 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="extract-content"
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261223 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261229 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: E0223 08:35:03.261248 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="extract-content"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261254 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="extract-content"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261481 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d7c30f-434f-4bf3-9a96-78057acb88d7" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.261506 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b1e241-a92f-4306-a094-996a88eb9d07" containerName="registry-server"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.263156 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.293134 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"]
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.376160 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f" exitCode=0
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.376689 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f"}
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.376773 5118 scope.go:117] "RemoveContainer" containerID="b031d20b335476addf5a2e8f80899cce86dce4c334cb9a937941e345ff578b1e"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.377772 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.377830 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.377854 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jph6q\" (UniqueName: \"kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.480452 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.480984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.481384 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.481003 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.481538 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jph6q\" (UniqueName: \"kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.506749 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jph6q\" (UniqueName: \"kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q\") pod \"redhat-operators-z2vn4\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.616792 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:03 crc kubenswrapper[5118]: I0223 08:35:03.915827 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"]
Feb 23 08:35:04 crc kubenswrapper[5118]: I0223 08:35:04.388205 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92"}
Feb 23 08:35:04 crc kubenswrapper[5118]: I0223 08:35:04.390725 5118 generic.go:334] "Generic (PLEG): container finished" podID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerID="9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e" exitCode=0
Feb 23 08:35:04 crc kubenswrapper[5118]: I0223 08:35:04.390779 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerDied","Data":"9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e"}
Feb 23 08:35:04 crc kubenswrapper[5118]: I0223 08:35:04.390802 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerStarted","Data":"ed411ef03d1ae8b2897ca5f18011753dcfc51075a97b7f7bf45e19efc075f8e1"}
Feb 23 08:35:05 crc kubenswrapper[5118]: I0223 08:35:05.402190 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerStarted","Data":"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d"}
Feb 23 08:35:06 crc kubenswrapper[5118]: I0223 08:35:06.417500 5118 generic.go:334] "Generic (PLEG): container finished" podID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerID="99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d" exitCode=0
Feb 23 08:35:06 crc kubenswrapper[5118]: I0223 08:35:06.417585 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerDied","Data":"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d"}
Feb 23 08:35:07 crc kubenswrapper[5118]: I0223 08:35:07.093715 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9vfl4"]
Feb 23 08:35:07 crc kubenswrapper[5118]: I0223 08:35:07.100855 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9vfl4"]
Feb 23 08:35:07 crc kubenswrapper[5118]: I0223 08:35:07.431221 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerStarted","Data":"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516"}
Feb 23 08:35:07 crc kubenswrapper[5118]: I0223 08:35:07.454376 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z2vn4" podStartSLOduration=2.043743641 podStartE2EDuration="4.454345857s" podCreationTimestamp="2026-02-23 08:35:03 +0000 UTC" firstStartedPulling="2026-02-23 08:35:04.393046918 +0000 UTC m=+6567.396831491" lastFinishedPulling="2026-02-23 08:35:06.803649094 +0000 UTC m=+6569.807433707" observedRunningTime="2026-02-23 08:35:07.452915622 +0000 UTC m=+6570.456700225" watchObservedRunningTime="2026-02-23 08:35:07.454345857 +0000 UTC m=+6570.458130450"
Feb 23 08:35:07 crc kubenswrapper[5118]: I0223 08:35:07.716277 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90069789-63ca-43a0-9aea-a49dc5f2233f" path="/var/lib/kubelet/pods/90069789-63ca-43a0-9aea-a49dc5f2233f/volumes"
Feb 23 08:35:13 crc kubenswrapper[5118]: I0223 08:35:13.617753 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:13 crc kubenswrapper[5118]: I0223 08:35:13.618757 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z2vn4"
Feb 23 08:35:14 crc kubenswrapper[5118]: I0223 08:35:14.680778 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z2vn4" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:35:14 crc kubenswrapper[5118]: 	timeout: failed to connect service ":50051" within 1s
Feb 23 08:35:14 crc kubenswrapper[5118]: >
Feb 23 08:35:23 crc kubenswrapper[5118]: I0223 08:35:23.714857 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/redhat-operators-z2vn4" Feb 23 08:35:23 crc kubenswrapper[5118]: I0223 08:35:23.788155 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z2vn4" Feb 23 08:35:23 crc kubenswrapper[5118]: I0223 08:35:23.976383 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"] Feb 23 08:35:25 crc kubenswrapper[5118]: I0223 08:35:25.603156 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z2vn4" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="registry-server" containerID="cri-o://05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516" gracePeriod=2 Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.098477 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2vn4" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.149377 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities\") pod \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.149717 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jph6q\" (UniqueName: \"kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q\") pod \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\" (UID: \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.149804 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content\") pod \"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\" (UID: 
\"b18a6f7e-a6f2-497d-9178-d079fb3d25ad\") " Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.151359 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities" (OuterVolumeSpecName: "utilities") pod "b18a6f7e-a6f2-497d-9178-d079fb3d25ad" (UID: "b18a6f7e-a6f2-497d-9178-d079fb3d25ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.161901 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q" (OuterVolumeSpecName: "kube-api-access-jph6q") pod "b18a6f7e-a6f2-497d-9178-d079fb3d25ad" (UID: "b18a6f7e-a6f2-497d-9178-d079fb3d25ad"). InnerVolumeSpecName "kube-api-access-jph6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.253200 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.253313 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jph6q\" (UniqueName: \"kubernetes.io/projected/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-kube-api-access-jph6q\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.288930 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b18a6f7e-a6f2-497d-9178-d079fb3d25ad" (UID: "b18a6f7e-a6f2-497d-9178-d079fb3d25ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.354957 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18a6f7e-a6f2-497d-9178-d079fb3d25ad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.620508 5118 generic.go:334] "Generic (PLEG): container finished" podID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerID="05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516" exitCode=0 Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.620636 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z2vn4" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.620665 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerDied","Data":"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516"} Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.620824 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z2vn4" event={"ID":"b18a6f7e-a6f2-497d-9178-d079fb3d25ad","Type":"ContainerDied","Data":"ed411ef03d1ae8b2897ca5f18011753dcfc51075a97b7f7bf45e19efc075f8e1"} Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.620879 5118 scope.go:117] "RemoveContainer" containerID="05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.654678 5118 scope.go:117] "RemoveContainer" containerID="99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.694348 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"] Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 
08:35:26.709933 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z2vn4"] Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.714387 5118 scope.go:117] "RemoveContainer" containerID="9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.762303 5118 scope.go:117] "RemoveContainer" containerID="05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516" Feb 23 08:35:26 crc kubenswrapper[5118]: E0223 08:35:26.763528 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516\": container with ID starting with 05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516 not found: ID does not exist" containerID="05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.763639 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516"} err="failed to get container status \"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516\": rpc error: code = NotFound desc = could not find container \"05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516\": container with ID starting with 05a844ccdc5e04475dc783ee342f188f0798605f25d1d43e38eb15d7ce37e516 not found: ID does not exist" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.763691 5118 scope.go:117] "RemoveContainer" containerID="99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d" Feb 23 08:35:26 crc kubenswrapper[5118]: E0223 08:35:26.764288 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d\": container with ID 
starting with 99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d not found: ID does not exist" containerID="99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.764705 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d"} err="failed to get container status \"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d\": rpc error: code = NotFound desc = could not find container \"99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d\": container with ID starting with 99901f4d368a068d9426b5b9ba9ae2a9e4a54097a6922bc0597b7f93322cb46d not found: ID does not exist" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.764947 5118 scope.go:117] "RemoveContainer" containerID="9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e" Feb 23 08:35:26 crc kubenswrapper[5118]: E0223 08:35:26.765689 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e\": container with ID starting with 9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e not found: ID does not exist" containerID="9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e" Feb 23 08:35:26 crc kubenswrapper[5118]: I0223 08:35:26.765789 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e"} err="failed to get container status \"9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e\": rpc error: code = NotFound desc = could not find container \"9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e\": container with ID starting with 9489dda376b9224b7eeeb8767a9db69886e886b140b1deb0d66cec2ace1c872e not found: 
ID does not exist" Feb 23 08:35:27 crc kubenswrapper[5118]: I0223 08:35:27.716924 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" path="/var/lib/kubelet/pods/b18a6f7e-a6f2-497d-9178-d079fb3d25ad/volumes" Feb 23 08:35:51 crc kubenswrapper[5118]: I0223 08:35:51.755820 5118 scope.go:117] "RemoveContainer" containerID="e3278b9dc1a7662d6fb2aeed4b7f12bfa7e0cabeb6e13e6905f874640e2e9f4c" Feb 23 08:37:32 crc kubenswrapper[5118]: I0223 08:37:32.975332 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:37:32 crc kubenswrapper[5118]: I0223 08:37:32.976347 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:38:02 crc kubenswrapper[5118]: I0223 08:38:02.975572 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:38:02 crc kubenswrapper[5118]: I0223 08:38:02.976453 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:38:32 crc kubenswrapper[5118]: I0223 
08:38:32.974760 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:38:32 crc kubenswrapper[5118]: I0223 08:38:32.975519 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:38:32 crc kubenswrapper[5118]: I0223 08:38:32.975600 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:38:32 crc kubenswrapper[5118]: I0223 08:38:32.976642 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:38:32 crc kubenswrapper[5118]: I0223 08:38:32.976766 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" gracePeriod=600 Feb 23 08:38:33 crc kubenswrapper[5118]: E0223 08:38:33.109468 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:38:33 crc kubenswrapper[5118]: I0223 08:38:33.560973 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" exitCode=0 Feb 23 08:38:33 crc kubenswrapper[5118]: I0223 08:38:33.561024 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92"} Feb 23 08:38:33 crc kubenswrapper[5118]: I0223 08:38:33.561117 5118 scope.go:117] "RemoveContainer" containerID="3b5aaeff40057d33bd8821cc5a710b83456558795d146f2b7bcff3286f475d3f" Feb 23 08:38:33 crc kubenswrapper[5118]: I0223 08:38:33.561617 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:38:33 crc kubenswrapper[5118]: E0223 08:38:33.562070 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.370458 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:38:41 crc kubenswrapper[5118]: E0223 08:38:41.372590 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="registry-server" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.372668 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="registry-server" Feb 23 08:38:41 crc kubenswrapper[5118]: E0223 08:38:41.372725 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="extract-content" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.372777 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="extract-content" Feb 23 08:38:41 crc kubenswrapper[5118]: E0223 08:38:41.372845 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="extract-utilities" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.372898 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="extract-utilities" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.373117 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18a6f7e-a6f2-497d-9178-d079fb3d25ad" containerName="registry-server" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.375429 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.379844 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbpwf" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.402434 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.506560 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8j8z\" (UniqueName: \"kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.507013 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.608339 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8j8z\" (UniqueName: \"kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.608524 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" 
Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.612755 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.612952 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7575ae75df8fca4ccf702cf25d4c158ce779fd86f8b24a4e87d93b2f8f1cce22/globalmount\"" pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.637589 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8j8z\" (UniqueName: \"kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.658956 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") pod \"mariadb-copy-data\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") " pod="openstack/mariadb-copy-data" Feb 23 08:38:41 crc kubenswrapper[5118]: I0223 08:38:41.700985 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 08:38:42 crc kubenswrapper[5118]: I0223 08:38:42.253412 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 08:38:42 crc kubenswrapper[5118]: I0223 08:38:42.662796 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38283e4b-8372-40cb-83b7-c1e08e09dd96","Type":"ContainerStarted","Data":"513f8feb41a7ca6f964fae4153244686fc0f735317ffdcfe8668d70db150c8c3"} Feb 23 08:38:42 crc kubenswrapper[5118]: I0223 08:38:42.662869 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38283e4b-8372-40cb-83b7-c1e08e09dd96","Type":"ContainerStarted","Data":"6e0016d09d1a32d59219ae2aeb894fc6b1b82562ed486576a8c4b86a19b79ac3"} Feb 23 08:38:42 crc kubenswrapper[5118]: I0223 08:38:42.683872 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.683842477 podStartE2EDuration="2.683842477s" podCreationTimestamp="2026-02-23 08:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:38:42.682744631 +0000 UTC m=+6785.686529204" watchObservedRunningTime="2026-02-23 08:38:42.683842477 +0000 UTC m=+6785.687627060" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.069958 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.075265 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.087246 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.191162 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cghs\" (UniqueName: \"kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs\") pod \"mariadb-client\" (UID: \"705e77a1-a587-4598-b674-cb45bdeeb3bd\") " pod="openstack/mariadb-client" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.293618 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cghs\" (UniqueName: \"kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs\") pod \"mariadb-client\" (UID: \"705e77a1-a587-4598-b674-cb45bdeeb3bd\") " pod="openstack/mariadb-client" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.336016 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cghs\" (UniqueName: \"kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs\") pod \"mariadb-client\" (UID: \"705e77a1-a587-4598-b674-cb45bdeeb3bd\") " pod="openstack/mariadb-client" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.424451 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:46 crc kubenswrapper[5118]: I0223 08:38:46.921888 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:46 crc kubenswrapper[5118]: W0223 08:38:46.925539 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod705e77a1_a587_4598_b674_cb45bdeeb3bd.slice/crio-6448bab94ade67f6c428a4e5b31ed076bc68f6463ca0818a77bb7b8faa609e8a WatchSource:0}: Error finding container 6448bab94ade67f6c428a4e5b31ed076bc68f6463ca0818a77bb7b8faa609e8a: Status 404 returned error can't find the container with id 6448bab94ade67f6c428a4e5b31ed076bc68f6463ca0818a77bb7b8faa609e8a Feb 23 08:38:47 crc kubenswrapper[5118]: I0223 08:38:47.710396 5118 generic.go:334] "Generic (PLEG): container finished" podID="705e77a1-a587-4598-b674-cb45bdeeb3bd" containerID="013b795a435dd3b31f324f68dbbb3a61a441afd35f9051ff16c2f2541a5cdc29" exitCode=0 Feb 23 08:38:47 crc kubenswrapper[5118]: I0223 08:38:47.713314 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"705e77a1-a587-4598-b674-cb45bdeeb3bd","Type":"ContainerDied","Data":"013b795a435dd3b31f324f68dbbb3a61a441afd35f9051ff16c2f2541a5cdc29"} Feb 23 08:38:47 crc kubenswrapper[5118]: I0223 08:38:47.713352 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"705e77a1-a587-4598-b674-cb45bdeeb3bd","Type":"ContainerStarted","Data":"6448bab94ade67f6c428a4e5b31ed076bc68f6463ca0818a77bb7b8faa609e8a"} Feb 23 08:38:48 crc kubenswrapper[5118]: I0223 08:38:48.696846 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:38:48 crc kubenswrapper[5118]: E0223 08:38:48.697480 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.108752 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.148429 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_705e77a1-a587-4598-b674-cb45bdeeb3bd/mariadb-client/0.log" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.157727 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cghs\" (UniqueName: \"kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs\") pod \"705e77a1-a587-4598-b674-cb45bdeeb3bd\" (UID: \"705e77a1-a587-4598-b674-cb45bdeeb3bd\") " Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.191468 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs" (OuterVolumeSpecName: "kube-api-access-5cghs") pod "705e77a1-a587-4598-b674-cb45bdeeb3bd" (UID: "705e77a1-a587-4598-b674-cb45bdeeb3bd"). InnerVolumeSpecName "kube-api-access-5cghs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.195494 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.202944 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.260764 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cghs\" (UniqueName: \"kubernetes.io/projected/705e77a1-a587-4598-b674-cb45bdeeb3bd-kube-api-access-5cghs\") on node \"crc\" DevicePath \"\"" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.346208 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:49 crc kubenswrapper[5118]: E0223 08:38:49.346580 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705e77a1-a587-4598-b674-cb45bdeeb3bd" containerName="mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.346597 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="705e77a1-a587-4598-b674-cb45bdeeb3bd" containerName="mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.346760 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="705e77a1-a587-4598-b674-cb45bdeeb3bd" containerName="mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.347310 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.356441 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.463589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjgc\" (UniqueName: \"kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc\") pod \"mariadb-client\" (UID: \"2a4dd096-0a16-4e93-a192-652cc7f50dfc\") " pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.568400 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjgc\" (UniqueName: \"kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc\") pod \"mariadb-client\" (UID: \"2a4dd096-0a16-4e93-a192-652cc7f50dfc\") " pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.605057 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjgc\" (UniqueName: \"kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc\") pod \"mariadb-client\" (UID: \"2a4dd096-0a16-4e93-a192-652cc7f50dfc\") " pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.666769 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.747153 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705e77a1-a587-4598-b674-cb45bdeeb3bd" path="/var/lib/kubelet/pods/705e77a1-a587-4598-b674-cb45bdeeb3bd/volumes" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.750015 5118 scope.go:117] "RemoveContainer" containerID="013b795a435dd3b31f324f68dbbb3a61a441afd35f9051ff16c2f2541a5cdc29" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.750061 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:49 crc kubenswrapper[5118]: I0223 08:38:49.932452 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:50 crc kubenswrapper[5118]: I0223 08:38:50.768070 5118 generic.go:334] "Generic (PLEG): container finished" podID="2a4dd096-0a16-4e93-a192-652cc7f50dfc" containerID="f2b275fc7e60788c5a1643b1a406e58a8a1e4a5c7d059999a801e8ab0beba8ee" exitCode=0 Feb 23 08:38:50 crc kubenswrapper[5118]: I0223 08:38:50.768172 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2a4dd096-0a16-4e93-a192-652cc7f50dfc","Type":"ContainerDied","Data":"f2b275fc7e60788c5a1643b1a406e58a8a1e4a5c7d059999a801e8ab0beba8ee"} Feb 23 08:38:50 crc kubenswrapper[5118]: I0223 08:38:50.768258 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"2a4dd096-0a16-4e93-a192-652cc7f50dfc","Type":"ContainerStarted","Data":"7bae5cff3df32efb973a3018e62f3c5004e06e71e7864f547f913a5bdae9485a"} Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.140747 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.160019 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_2a4dd096-0a16-4e93-a192-652cc7f50dfc/mariadb-client/0.log" Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.203738 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.209351 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.218152 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjgc\" (UniqueName: \"kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc\") pod \"2a4dd096-0a16-4e93-a192-652cc7f50dfc\" (UID: \"2a4dd096-0a16-4e93-a192-652cc7f50dfc\") " Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.232021 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc" (OuterVolumeSpecName: "kube-api-access-4xjgc") pod "2a4dd096-0a16-4e93-a192-652cc7f50dfc" (UID: "2a4dd096-0a16-4e93-a192-652cc7f50dfc"). InnerVolumeSpecName "kube-api-access-4xjgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.320382 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjgc\" (UniqueName: \"kubernetes.io/projected/2a4dd096-0a16-4e93-a192-652cc7f50dfc-kube-api-access-4xjgc\") on node \"crc\" DevicePath \"\"" Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.797604 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bae5cff3df32efb973a3018e62f3c5004e06e71e7864f547f913a5bdae9485a" Feb 23 08:38:52 crc kubenswrapper[5118]: I0223 08:38:52.797726 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 23 08:38:53 crc kubenswrapper[5118]: I0223 08:38:53.733191 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4dd096-0a16-4e93-a192-652cc7f50dfc" path="/var/lib/kubelet/pods/2a4dd096-0a16-4e93-a192-652cc7f50dfc/volumes" Feb 23 08:39:02 crc kubenswrapper[5118]: I0223 08:39:02.698346 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:39:02 crc kubenswrapper[5118]: E0223 08:39:02.699678 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:39:13 crc kubenswrapper[5118]: I0223 08:39:13.697480 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:39:13 crc kubenswrapper[5118]: E0223 08:39:13.698572 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.760291 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 08:39:23 crc kubenswrapper[5118]: E0223 08:39:23.761761 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a4dd096-0a16-4e93-a192-652cc7f50dfc" containerName="mariadb-client" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.761796 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4dd096-0a16-4e93-a192-652cc7f50dfc" containerName="mariadb-client" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.762257 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4dd096-0a16-4e93-a192-652cc7f50dfc" containerName="mariadb-client" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.764131 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.769860 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dkt2d" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.771374 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.771678 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.786496 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.788752 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.805970 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.831381 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.842939 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.854297 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.867355 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882459 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50fcd88-9e24-4304-9ba8-892b357328b5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882641 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b50fcd88-9e24-4304-9ba8-892b357328b5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882683 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882703 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882729 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c4nz\" (UniqueName: \"kubernetes.io/projected/b50fcd88-9e24-4304-9ba8-892b357328b5-kube-api-access-7c4nz\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.882961 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.954698 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.957144 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.962533 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4lqzd" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.963025 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.963383 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.980634 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984340 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d0221c3-d348-4f99-9f66-2ff299664efa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984402 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984443 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4c002-69d9-4589-8c0b-a110c31b8314-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984704 
5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b50fcd88-9e24-4304-9ba8-892b357328b5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984753 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0221c3-d348-4f99-9f66-2ff299664efa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.984792 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985018 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttml\" (UniqueName: \"kubernetes.io/projected/3d0221c3-d348-4f99-9f66-2ff299664efa-kube-api-access-7ttml\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985062 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985256 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c4nz\" (UniqueName: 
\"kubernetes.io/projected/b50fcd88-9e24-4304-9ba8-892b357328b5-kube-api-access-7c4nz\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985325 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4c002-69d9-4589-8c0b-a110c31b8314-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985359 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbgh\" (UniqueName: \"kubernetes.io/projected/a9c4c002-69d9-4589-8c0b-a110c31b8314-kube-api-access-nkbgh\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985461 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-config\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985517 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985541 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-scripts\") pod \"ovsdbserver-nb-1\" 
(UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985626 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985901 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.985944 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-config\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.986020 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50fcd88-9e24-4304-9ba8-892b357328b5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.986763 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b50fcd88-9e24-4304-9ba8-892b357328b5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " 
pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.988025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-config\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:23 crc kubenswrapper[5118]: I0223 08:39:23.989959 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b50fcd88-9e24-4304-9ba8-892b357328b5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:23.996996 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b50fcd88-9e24-4304-9ba8-892b357328b5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:23.998023 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:23.999822 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.003292 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.003347 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/29c8cf01a2469b5b87e44041a0642904c4407e105c0060108b6f1d6d7d27c3b1/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.009510 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.016929 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.018994 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c4nz\" (UniqueName: \"kubernetes.io/projected/b50fcd88-9e24-4304-9ba8-892b357328b5-kube-api-access-7c4nz\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.022434 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.024458 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.069847 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d4c0c74-464b-4206-a4ec-d4340c5edea0\") pod \"ovsdbserver-nb-0\" (UID: \"b50fcd88-9e24-4304-9ba8-892b357328b5\") " pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.087613 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0221c3-d348-4f99-9f66-2ff299664efa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.087678 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.087714 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttml\" (UniqueName: \"kubernetes.io/projected/3d0221c3-d348-4f99-9f66-2ff299664efa-kube-api-access-7ttml\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.087742 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9gv\" (UniqueName: 
\"kubernetes.io/projected/d8b50ef3-7b23-448e-86cb-6e46a16c2624-kube-api-access-4k9gv\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.087784 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8b50ef3-7b23-448e-86cb-6e46a16c2624-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088134 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4c002-69d9-4589-8c0b-a110c31b8314-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088184 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbgh\" (UniqueName: \"kubernetes.io/projected/a9c4c002-69d9-4589-8c0b-a110c31b8314-kube-api-access-nkbgh\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088313 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-config\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088373 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " 
pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088404 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088433 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b50ef3-7b23-448e-86cb-6e46a16c2624-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088458 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pt2\" (UniqueName: \"kubernetes.io/projected/9b32637c-2170-4283-9a68-b8c4af56ef90-kube-api-access-54pt2\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088514 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e75a46-97ac-454a-aae6-988e4b4bd679-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45e75a46-97ac-454a-aae6-988e4b4bd679-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 
08:39:24.088631 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088679 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088717 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32637c-2170-4283-9a68-b8c4af56ef90-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a9c4c002-69d9-4589-8c0b-a110c31b8314-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088746 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32637c-2170-4283-9a68-b8c4af56ef90-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088787 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-config\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088813 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.088858 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-config\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.089934 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-config\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090079 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d0221c3-d348-4f99-9f66-2ff299664efa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090128 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\") 
pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090148 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4c002-69d9-4589-8c0b-a110c31b8314-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090241 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-config\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090274 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090307 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gxv6\" (UniqueName: \"kubernetes.io/projected/45e75a46-97ac-454a-aae6-988e4b4bd679-kube-api-access-9gxv6\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090330 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6a63eaa5-4774-420e-b08c-6a7908584160\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a63eaa5-4774-420e-b08c-6a7908584160\") pod \"ovsdbserver-sb-2\" (UID: 
\"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090358 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-config\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.090559 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.092146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-config\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.092354 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9c4c002-69d9-4589-8c0b-a110c31b8314-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.092502 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0221c3-d348-4f99-9f66-2ff299664efa-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.092996 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d0221c3-d348-4f99-9f66-2ff299664efa-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.095556 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.095593 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f9b1ec9df5b95ac4521e7d05db444937062194a8a104c69ee9d03aaa432f389e/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.095634 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.095714 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7b029b2c0449cefc0ba073058939e8c99d48f85bf205fd4104a6f84c261d9a8e/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.097501 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9c4c002-69d9-4589-8c0b-a110c31b8314-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.098630 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d0221c3-d348-4f99-9f66-2ff299664efa-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.098867 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.105883 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbgh\" (UniqueName: \"kubernetes.io/projected/a9c4c002-69d9-4589-8c0b-a110c31b8314-kube-api-access-nkbgh\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.107601 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttml\" (UniqueName: \"kubernetes.io/projected/3d0221c3-d348-4f99-9f66-2ff299664efa-kube-api-access-7ttml\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.131487 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-751a6615-5df4-4423-b8ce-a02f9695dac9\") pod \"ovsdbserver-nb-2\" (UID: \"3d0221c3-d348-4f99-9f66-2ff299664efa\") " pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.136252 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eb64364-3710-45e8-82d7-4e6e4f444a9a\") pod \"ovsdbserver-nb-1\" (UID: \"a9c4c002-69d9-4589-8c0b-a110c31b8314\") " pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.172364 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197236 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197302 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32637c-2170-4283-9a68-b8c4af56ef90-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32637c-2170-4283-9a68-b8c4af56ef90-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197381 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-config\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197423 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-config\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197440 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197461 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6a63eaa5-4774-420e-b08c-6a7908584160\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a63eaa5-4774-420e-b08c-6a7908584160\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197477 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gxv6\" (UniqueName: \"kubernetes.io/projected/45e75a46-97ac-454a-aae6-988e4b4bd679-kube-api-access-9gxv6\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197498 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-config\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197527 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197557 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197579 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9gv\" (UniqueName: \"kubernetes.io/projected/d8b50ef3-7b23-448e-86cb-6e46a16c2624-kube-api-access-4k9gv\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197603 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8b50ef3-7b23-448e-86cb-6e46a16c2624-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197627 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b50ef3-7b23-448e-86cb-6e46a16c2624-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197644 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54pt2\" (UniqueName: \"kubernetes.io/projected/9b32637c-2170-4283-9a68-b8c4af56ef90-kube-api-access-54pt2\") pod \"ovsdbserver-sb-2\" (UID: 
\"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197663 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e75a46-97ac-454a-aae6-988e4b4bd679-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.197692 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45e75a46-97ac-454a-aae6-988e4b4bd679-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.198194 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/45e75a46-97ac-454a-aae6-988e4b4bd679-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.200346 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9b32637c-2170-4283-9a68-b8c4af56ef90-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.200747 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-config\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.200791 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-config\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.201030 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e75a46-97ac-454a-aae6-988e4b4bd679-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.201419 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d8b50ef3-7b23-448e-86cb-6e46a16c2624-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.201657 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.202261 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b50ef3-7b23-448e-86cb-6e46a16c2624-config\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.203257 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b32637c-2170-4283-9a68-b8c4af56ef90-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.206119 5118 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.206142 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.206160 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6a63eaa5-4774-420e-b08c-6a7908584160\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a63eaa5-4774-420e-b08c-6a7908584160\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3d2180ba62a649e1c055bdd3a6a85b4c03f5258dfdc7e6aead1e379f4f2ac19/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.206186 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/947f44cdc48bafbc7d42a764b17ac25fad2f34fd6f7602e7b8d020b1929fcf34/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.207074 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.207149 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/24230c703aadaeed8127ead1065cf7218272563822573a6fb01156b96830c08a/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.211204 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b32637c-2170-4283-9a68-b8c4af56ef90-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.212366 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8b50ef3-7b23-448e-86cb-6e46a16c2624-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.218341 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gxv6\" (UniqueName: \"kubernetes.io/projected/45e75a46-97ac-454a-aae6-988e4b4bd679-kube-api-access-9gxv6\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.225755 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45e75a46-97ac-454a-aae6-988e4b4bd679-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.228001 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pt2\" (UniqueName: \"kubernetes.io/projected/9b32637c-2170-4283-9a68-b8c4af56ef90-kube-api-access-54pt2\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.232414 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9gv\" (UniqueName: \"kubernetes.io/projected/d8b50ef3-7b23-448e-86cb-6e46a16c2624-kube-api-access-4k9gv\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.258948 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbd322ff-0c23-4c1e-bef9-60b1eac3cade\") pod \"ovsdbserver-sb-0\" (UID: \"45e75a46-97ac-454a-aae6-988e4b4bd679\") " pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.261922 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca5afa86-3545-465b-8a63-254b4d798d5c\") pod \"ovsdbserver-sb-1\" (UID: \"d8b50ef3-7b23-448e-86cb-6e46a16c2624\") " pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.286622 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6a63eaa5-4774-420e-b08c-6a7908584160\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6a63eaa5-4774-420e-b08c-6a7908584160\") pod \"ovsdbserver-sb-2\" (UID: \"9b32637c-2170-4283-9a68-b8c4af56ef90\") " 
pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.298698 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.430251 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.478109 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.489836 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.692688 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.756600 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 23 08:39:24 crc kubenswrapper[5118]: W0223 08:39:24.760838 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9c4c002_69d9_4589_8c0b_a110c31b8314.slice/crio-b5dc8b4b82ffd0d3bca9f366ac2a0ab6f052d0f782c7d0730d3dce5773feef33 WatchSource:0}: Error finding container b5dc8b4b82ffd0d3bca9f366ac2a0ab6f052d0f782c7d0730d3dce5773feef33: Status 404 returned error can't find the container with id b5dc8b4b82ffd0d3bca9f366ac2a0ab6f052d0f782c7d0730d3dce5773feef33 Feb 23 08:39:24 crc kubenswrapper[5118]: I0223 08:39:24.850666 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 08:39:24 crc kubenswrapper[5118]: W0223 08:39:24.855500 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45e75a46_97ac_454a_aae6_988e4b4bd679.slice/crio-c740b930c39b09383eda6720812cc13fffafce979b6ad38a838860924f1cb6c7 WatchSource:0}: Error finding container c740b930c39b09383eda6720812cc13fffafce979b6ad38a838860924f1cb6c7: Status 404 returned error can't find the container with id c740b930c39b09383eda6720812cc13fffafce979b6ad38a838860924f1cb6c7 Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.072081 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 23 08:39:25 crc kubenswrapper[5118]: W0223 08:39:25.084477 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b32637c_2170_4283_9a68_b8c4af56ef90.slice/crio-b64e7cf7be98612502b6e4d127a93cc814a6e16fb829b20ba5b9b50894c5f568 WatchSource:0}: Error finding container b64e7cf7be98612502b6e4d127a93cc814a6e16fb829b20ba5b9b50894c5f568: Status 404 returned error can't find the container with id b64e7cf7be98612502b6e4d127a93cc814a6e16fb829b20ba5b9b50894c5f568 Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.114213 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b50fcd88-9e24-4304-9ba8-892b357328b5","Type":"ContainerStarted","Data":"97fde88894d33a73313e78519c5b9187c1ffe68df5a83796e6a6398683baa84c"} Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.117172 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a9c4c002-69d9-4589-8c0b-a110c31b8314","Type":"ContainerStarted","Data":"b5dc8b4b82ffd0d3bca9f366ac2a0ab6f052d0f782c7d0730d3dce5773feef33"} Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.121016 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"45e75a46-97ac-454a-aae6-988e4b4bd679","Type":"ContainerStarted","Data":"c740b930c39b09383eda6720812cc13fffafce979b6ad38a838860924f1cb6c7"} Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.122546 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9b32637c-2170-4283-9a68-b8c4af56ef90","Type":"ContainerStarted","Data":"b64e7cf7be98612502b6e4d127a93cc814a6e16fb829b20ba5b9b50894c5f568"} Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.185659 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 23 08:39:25 crc kubenswrapper[5118]: W0223 08:39:25.191363 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b50ef3_7b23_448e_86cb_6e46a16c2624.slice/crio-8d6cc2c586546eae58ecb0d1f57eb2f5180348bce3008fb0f9f4f06e81843aac WatchSource:0}: Error finding container 8d6cc2c586546eae58ecb0d1f57eb2f5180348bce3008fb0f9f4f06e81843aac: Status 404 returned error can't find the container with id 8d6cc2c586546eae58ecb0d1f57eb2f5180348bce3008fb0f9f4f06e81843aac Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.278956 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.282777 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.285344 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.321255 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.321406 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f74vc\" (UniqueName: \"kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.321450 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.396860 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 23 08:39:25 crc kubenswrapper[5118]: W0223 08:39:25.407338 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d0221c3_d348_4f99_9f66_2ff299664efa.slice/crio-b95e4a34ba9482b23623a4b91927358c35811ca4be8a4dd573089f360babe737 WatchSource:0}: Error finding container 
b95e4a34ba9482b23623a4b91927358c35811ca4be8a4dd573089f360babe737: Status 404 returned error can't find the container with id b95e4a34ba9482b23623a4b91927358c35811ca4be8a4dd573089f360babe737 Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.423559 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f74vc\" (UniqueName: \"kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.423604 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.424242 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.424300 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.424305 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.444575 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f74vc\" (UniqueName: \"kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc\") pod \"redhat-marketplace-ncmt6\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:25 crc kubenswrapper[5118]: I0223 08:39:25.623282 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:26 crc kubenswrapper[5118]: I0223 08:39:26.091803 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:26 crc kubenswrapper[5118]: I0223 08:39:26.133685 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerStarted","Data":"2f3e3e56799c9d343ddb5a7e2b2b87f38cbb4b0e0b70d330d35a969719ba0e8d"} Feb 23 08:39:26 crc kubenswrapper[5118]: I0223 08:39:26.136258 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d8b50ef3-7b23-448e-86cb-6e46a16c2624","Type":"ContainerStarted","Data":"8d6cc2c586546eae58ecb0d1f57eb2f5180348bce3008fb0f9f4f06e81843aac"} Feb 23 08:39:26 crc kubenswrapper[5118]: I0223 08:39:26.138352 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3d0221c3-d348-4f99-9f66-2ff299664efa","Type":"ContainerStarted","Data":"b95e4a34ba9482b23623a4b91927358c35811ca4be8a4dd573089f360babe737"} Feb 23 08:39:27 crc kubenswrapper[5118]: I0223 08:39:27.151709 5118 generic.go:334] "Generic 
(PLEG): container finished" podID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerID="2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1" exitCode=0 Feb 23 08:39:27 crc kubenswrapper[5118]: I0223 08:39:27.151854 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerDied","Data":"2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1"} Feb 23 08:39:27 crc kubenswrapper[5118]: I0223 08:39:27.158063 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:39:28 crc kubenswrapper[5118]: I0223 08:39:28.193371 5118 generic.go:334] "Generic (PLEG): container finished" podID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerID="98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2" exitCode=0 Feb 23 08:39:28 crc kubenswrapper[5118]: I0223 08:39:28.194741 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerDied","Data":"98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2"} Feb 23 08:39:28 crc kubenswrapper[5118]: I0223 08:39:28.697633 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:39:28 crc kubenswrapper[5118]: E0223 08:39:28.698305 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.226213 5118 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerStarted","Data":"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.229375 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9b32637c-2170-4283-9a68-b8c4af56ef90","Type":"ContainerStarted","Data":"38fef7d03f9537993e271afab4c7c13a153b6bb51acf166e9d46826ffdadaed0"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.229432 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9b32637c-2170-4283-9a68-b8c4af56ef90","Type":"ContainerStarted","Data":"d34b56bdb33ce2d91610e48aa12a8d95771c66e14bc639ba10c5304abad161dc"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.231778 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3d0221c3-d348-4f99-9f66-2ff299664efa","Type":"ContainerStarted","Data":"c70f155bcc352a85c73997af9d50cedb083d5ee7847001ccac613e6073cb93df"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.231812 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3d0221c3-d348-4f99-9f66-2ff299664efa","Type":"ContainerStarted","Data":"5a7cc358f411637f0530c4722d35cb891f8c3e6a3b109eeb45e880b8bd53c47a"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.235296 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b50fcd88-9e24-4304-9ba8-892b357328b5","Type":"ContainerStarted","Data":"2da71d43a399445afbb3ad3f858323a243efb4c6a06eedbc7554065892758c7d"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.235327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"b50fcd88-9e24-4304-9ba8-892b357328b5","Type":"ContainerStarted","Data":"d38076a1f8d2b4405bd000cbf47b3baf61ec8b0a539cefe3bb86f46a4d77afa6"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.237492 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a9c4c002-69d9-4589-8c0b-a110c31b8314","Type":"ContainerStarted","Data":"80a4fe92ecfb38144a8618216d3b531e9e4169661a76a11afa25c541a4443a0a"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.237516 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a9c4c002-69d9-4589-8c0b-a110c31b8314","Type":"ContainerStarted","Data":"646be89e155b76db3de9cd217bc6d3326c3f1c34d9f0ef5f58b3e2ee0914422e"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.241582 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"45e75a46-97ac-454a-aae6-988e4b4bd679","Type":"ContainerStarted","Data":"506914125ee2b67db2dd2e59f41622d3564d1a00d8c8c7816185ff8975fcbf05"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.241635 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"45e75a46-97ac-454a-aae6-988e4b4bd679","Type":"ContainerStarted","Data":"b98beeab84fe03d56aa224534021c817dfedbfea5d2011f1cdb16f3c0141c024"} Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.258079 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ncmt6" podStartSLOduration=2.853589917 podStartE2EDuration="6.258058251s" podCreationTimestamp="2026-02-23 08:39:25 +0000 UTC" firstStartedPulling="2026-02-23 08:39:27.156674559 +0000 UTC m=+6830.160459172" lastFinishedPulling="2026-02-23 08:39:30.561142913 +0000 UTC m=+6833.564927506" observedRunningTime="2026-02-23 08:39:31.252777783 +0000 UTC m=+6834.256562356" watchObservedRunningTime="2026-02-23 08:39:31.258058251 +0000 UTC m=+6834.261842824" 
Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.275718 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.582103121 podStartE2EDuration="9.275700785s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" firstStartedPulling="2026-02-23 08:39:24.859451404 +0000 UTC m=+6827.863235977" lastFinishedPulling="2026-02-23 08:39:30.553049068 +0000 UTC m=+6833.556833641" observedRunningTime="2026-02-23 08:39:31.273982314 +0000 UTC m=+6834.277766887" watchObservedRunningTime="2026-02-23 08:39:31.275700785 +0000 UTC m=+6834.279485358" Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.300460 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.5010297379999997 podStartE2EDuration="9.300433241s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" firstStartedPulling="2026-02-23 08:39:24.763710777 +0000 UTC m=+6827.767495350" lastFinishedPulling="2026-02-23 08:39:30.56311427 +0000 UTC m=+6833.566898853" observedRunningTime="2026-02-23 08:39:31.293558755 +0000 UTC m=+6834.297343318" watchObservedRunningTime="2026-02-23 08:39:31.300433241 +0000 UTC m=+6834.304217814" Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.318773 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.4155917000000002 podStartE2EDuration="9.318752202s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" firstStartedPulling="2026-02-23 08:39:24.693269061 +0000 UTC m=+6827.697053634" lastFinishedPulling="2026-02-23 08:39:30.596429573 +0000 UTC m=+6833.600214136" observedRunningTime="2026-02-23 08:39:31.314883189 +0000 UTC m=+6834.318667762" watchObservedRunningTime="2026-02-23 08:39:31.318752202 +0000 UTC m=+6834.322536775" Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.341502 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.174684054 podStartE2EDuration="9.341479939s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" firstStartedPulling="2026-02-23 08:39:25.412806173 +0000 UTC m=+6828.416590746" lastFinishedPulling="2026-02-23 08:39:30.579602058 +0000 UTC m=+6833.583386631" observedRunningTime="2026-02-23 08:39:31.338252102 +0000 UTC m=+6834.342036685" watchObservedRunningTime="2026-02-23 08:39:31.341479939 +0000 UTC m=+6834.345264512" Feb 23 08:39:31 crc kubenswrapper[5118]: I0223 08:39:31.363560 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.89008766 podStartE2EDuration="9.363530801s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" firstStartedPulling="2026-02-23 08:39:25.089196658 +0000 UTC m=+6828.092981221" lastFinishedPulling="2026-02-23 08:39:30.562639779 +0000 UTC m=+6833.566424362" observedRunningTime="2026-02-23 08:39:31.360823046 +0000 UTC m=+6834.364607639" watchObservedRunningTime="2026-02-23 08:39:31.363530801 +0000 UTC m=+6834.367315374" Feb 23 08:39:32 crc kubenswrapper[5118]: I0223 08:39:32.253563 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d8b50ef3-7b23-448e-86cb-6e46a16c2624","Type":"ContainerStarted","Data":"d2c6c026eb7dd785c6a10b6c8ab48d06d5f752dd912077a5f65f867dc04c0a19"} Feb 23 08:39:32 crc kubenswrapper[5118]: I0223 08:39:32.254058 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"d8b50ef3-7b23-448e-86cb-6e46a16c2624","Type":"ContainerStarted","Data":"1d4dd45274c8362817f6a97b365e6c3c12ebbf99b409c89120a38b964623146e"} Feb 23 08:39:32 crc kubenswrapper[5118]: I0223 08:39:32.281884 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.680409288 podStartE2EDuration="10.281857381s" podCreationTimestamp="2026-02-23 08:39:22 +0000 UTC" 
firstStartedPulling="2026-02-23 08:39:25.195617901 +0000 UTC m=+6828.199402474" lastFinishedPulling="2026-02-23 08:39:31.797066004 +0000 UTC m=+6834.800850567" observedRunningTime="2026-02-23 08:39:32.274145575 +0000 UTC m=+6835.277930148" watchObservedRunningTime="2026-02-23 08:39:32.281857381 +0000 UTC m=+6835.285641954" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.099654 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.173351 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.179205 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.235715 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.261453 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.261493 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.300187 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.335604 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.431048 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.479038 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.490425 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.490580 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:33 crc kubenswrapper[5118]: I0223 08:39:33.537461 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:34 crc kubenswrapper[5118]: I0223 08:39:34.271580 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:34 crc kubenswrapper[5118]: I0223 08:39:34.272552 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:34 crc kubenswrapper[5118]: I0223 08:39:34.272642 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:34 crc kubenswrapper[5118]: I0223 08:39:34.489983 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.342664 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.350936 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.356037 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.603880 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f9689979f-zmgv7"] Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.608482 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.611480 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.621378 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9689979f-zmgv7"] Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.625178 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.625219 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.715200 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.732630 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.732689 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.732911 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.732938 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t2vv\" (UniqueName: \"kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.800063 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9689979f-zmgv7"] Feb 23 08:39:35 crc kubenswrapper[5118]: E0223 08:39:35.800784 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-8t2vv ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" podUID="70ed0d19-84e1-46cf-81d2-997db5dc3fee" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.834797 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.835686 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t2vv\" (UniqueName: \"kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 
08:39:35.836675 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.836783 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.836830 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.837781 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.838596 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.858165 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:39:35 crc 
kubenswrapper[5118]: I0223 08:39:35.859557 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.864368 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.875474 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.882234 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t2vv\" (UniqueName: \"kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv\") pod \"dnsmasq-dns-5f9689979f-zmgv7\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.938447 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.938509 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.938661 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55t7\" (UniqueName: \"kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7\") pod 
\"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.939038 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:35 crc kubenswrapper[5118]: I0223 08:39:35.939638 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.042059 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55t7\" (UniqueName: \"kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.042309 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.042379 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" 
(UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.042453 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.042496 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.043353 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.044211 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.045066 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc 
kubenswrapper[5118]: I0223 08:39:36.045853 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.070211 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55t7\" (UniqueName: \"kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7\") pod \"dnsmasq-dns-568588dc4c-pjd2f\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.182650 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.295306 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.325829 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.347916 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb\") pod \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.348531 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t2vv\" (UniqueName: \"kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv\") pod \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.348655 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc\") pod \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.348757 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config\") pod \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\" (UID: \"70ed0d19-84e1-46cf-81d2-997db5dc3fee\") " Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.349011 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70ed0d19-84e1-46cf-81d2-997db5dc3fee" (UID: "70ed0d19-84e1-46cf-81d2-997db5dc3fee"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.349481 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.351159 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70ed0d19-84e1-46cf-81d2-997db5dc3fee" (UID: "70ed0d19-84e1-46cf-81d2-997db5dc3fee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.351371 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config" (OuterVolumeSpecName: "config") pod "70ed0d19-84e1-46cf-81d2-997db5dc3fee" (UID: "70ed0d19-84e1-46cf-81d2-997db5dc3fee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.372676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv" (OuterVolumeSpecName: "kube-api-access-8t2vv") pod "70ed0d19-84e1-46cf-81d2-997db5dc3fee" (UID: "70ed0d19-84e1-46cf-81d2-997db5dc3fee"). InnerVolumeSpecName "kube-api-access-8t2vv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.382851 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.451633 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.451663 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70ed0d19-84e1-46cf-81d2-997db5dc3fee-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.451673 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t2vv\" (UniqueName: \"kubernetes.io/projected/70ed0d19-84e1-46cf-81d2-997db5dc3fee-kube-api-access-8t2vv\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.536973 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:36 crc kubenswrapper[5118]: I0223 08:39:36.732895 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.309524 5118 generic.go:334] "Generic (PLEG): container finished" podID="4a992050-916c-4cad-873e-cef0807ef46a" containerID="3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309" exitCode=0 Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.309864 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9689979f-zmgv7" Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.309668 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" event={"ID":"4a992050-916c-4cad-873e-cef0807ef46a","Type":"ContainerDied","Data":"3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309"} Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.310271 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" event={"ID":"4a992050-916c-4cad-873e-cef0807ef46a","Type":"ContainerStarted","Data":"bd279d661d7470d6fa32ad27ef975ae99579f535b7ff02a54676ddcd9b9bfb1c"} Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.433971 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9689979f-zmgv7"] Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.517553 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f9689979f-zmgv7"] Feb 23 08:39:37 crc kubenswrapper[5118]: I0223 08:39:37.712381 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ed0d19-84e1-46cf-81d2-997db5dc3fee" path="/var/lib/kubelet/pods/70ed0d19-84e1-46cf-81d2-997db5dc3fee/volumes" Feb 23 08:39:38 crc kubenswrapper[5118]: I0223 08:39:38.319078 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" event={"ID":"4a992050-916c-4cad-873e-cef0807ef46a","Type":"ContainerStarted","Data":"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3"} Feb 23 08:39:38 crc kubenswrapper[5118]: I0223 08:39:38.319317 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:38 crc kubenswrapper[5118]: I0223 08:39:38.353897 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" podStartSLOduration=3.353870422 
podStartE2EDuration="3.353870422s" podCreationTimestamp="2026-02-23 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:39:38.345121391 +0000 UTC m=+6841.348905974" watchObservedRunningTime="2026-02-23 08:39:38.353870422 +0000 UTC m=+6841.357654995" Feb 23 08:39:39 crc kubenswrapper[5118]: I0223 08:39:39.181598 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 23 08:39:39 crc kubenswrapper[5118]: I0223 08:39:39.268927 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 23 08:39:39 crc kubenswrapper[5118]: I0223 08:39:39.537483 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 23 08:39:40 crc kubenswrapper[5118]: I0223 08:39:40.649619 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:40 crc kubenswrapper[5118]: I0223 08:39:40.650146 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncmt6" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="registry-server" containerID="cri-o://9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb" gracePeriod=2 Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.193638 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.267063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f74vc\" (UniqueName: \"kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc\") pod \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.267241 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content\") pod \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.267351 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities\") pod \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\" (UID: \"477e91d1-5190-4dd4-a80c-eb703c79bcbc\") " Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.268874 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities" (OuterVolumeSpecName: "utilities") pod "477e91d1-5190-4dd4-a80c-eb703c79bcbc" (UID: "477e91d1-5190-4dd4-a80c-eb703c79bcbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.275590 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc" (OuterVolumeSpecName: "kube-api-access-f74vc") pod "477e91d1-5190-4dd4-a80c-eb703c79bcbc" (UID: "477e91d1-5190-4dd4-a80c-eb703c79bcbc"). InnerVolumeSpecName "kube-api-access-f74vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.312275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "477e91d1-5190-4dd4-a80c-eb703c79bcbc" (UID: "477e91d1-5190-4dd4-a80c-eb703c79bcbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.354215 5118 generic.go:334] "Generic (PLEG): container finished" podID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerID="9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb" exitCode=0 Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.354283 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerDied","Data":"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb"} Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.354306 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncmt6" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.354345 5118 scope.go:117] "RemoveContainer" containerID="9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.354327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncmt6" event={"ID":"477e91d1-5190-4dd4-a80c-eb703c79bcbc","Type":"ContainerDied","Data":"2f3e3e56799c9d343ddb5a7e2b2b87f38cbb4b0e0b70d330d35a969719ba0e8d"} Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.372324 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f74vc\" (UniqueName: \"kubernetes.io/projected/477e91d1-5190-4dd4-a80c-eb703c79bcbc-kube-api-access-f74vc\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.372434 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.372453 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477e91d1-5190-4dd4-a80c-eb703c79bcbc-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.426536 5118 scope.go:117] "RemoveContainer" containerID="98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.436395 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.453449 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncmt6"] Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.468146 5118 scope.go:117] 
"RemoveContainer" containerID="2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.532803 5118 scope.go:117] "RemoveContainer" containerID="9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb" Feb 23 08:39:41 crc kubenswrapper[5118]: E0223 08:39:41.533327 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb\": container with ID starting with 9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb not found: ID does not exist" containerID="9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.533371 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb"} err="failed to get container status \"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb\": rpc error: code = NotFound desc = could not find container \"9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb\": container with ID starting with 9069324bb985e11300da83bcf95a8764a4fc773f02f4f0936bc91f2e8465adeb not found: ID does not exist" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.533400 5118 scope.go:117] "RemoveContainer" containerID="98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2" Feb 23 08:39:41 crc kubenswrapper[5118]: E0223 08:39:41.533689 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2\": container with ID starting with 98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2 not found: ID does not exist" containerID="98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2" Feb 23 08:39:41 crc 
kubenswrapper[5118]: I0223 08:39:41.533731 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2"} err="failed to get container status \"98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2\": rpc error: code = NotFound desc = could not find container \"98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2\": container with ID starting with 98399c63ff25dcacbd7510eca85c0dfe649af00bb65f26368334b83ff25936f2 not found: ID does not exist" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.533808 5118 scope.go:117] "RemoveContainer" containerID="2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1" Feb 23 08:39:41 crc kubenswrapper[5118]: E0223 08:39:41.534513 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1\": container with ID starting with 2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1 not found: ID does not exist" containerID="2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.534559 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1"} err="failed to get container status \"2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1\": rpc error: code = NotFound desc = could not find container \"2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1\": container with ID starting with 2bfc5807d104015b33565fc000038bda8e6f1f76ef9680922ee94e8083fa4ae1 not found: ID does not exist" Feb 23 08:39:41 crc kubenswrapper[5118]: I0223 08:39:41.715405 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" 
path="/var/lib/kubelet/pods/477e91d1-5190-4dd4-a80c-eb703c79bcbc/volumes" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.377394 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:39:42 crc kubenswrapper[5118]: E0223 08:39:42.378031 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="extract-content" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.378051 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="extract-content" Feb 23 08:39:42 crc kubenswrapper[5118]: E0223 08:39:42.378081 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="extract-utilities" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.378108 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="extract-utilities" Feb 23 08:39:42 crc kubenswrapper[5118]: E0223 08:39:42.378125 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="registry-server" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.378134 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="registry-server" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.378345 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="477e91d1-5190-4dd4-a80c-eb703c79bcbc" containerName="registry-server" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.379232 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.383062 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.397541 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.494079 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.494166 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.494590 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj78\" (UniqueName: \"kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.596905 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj78\" (UniqueName: \"kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.597035 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.597148 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.600892 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.600938 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0fe6bc48d467964692a3c0e0de6c72c0309a13a66f65a6c825c5558329aaa012/globalmount\"" pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.605523 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.626846 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj78\" (UniqueName: 
\"kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.639543 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") pod \"ovn-copy-data\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " pod="openstack/ovn-copy-data" Feb 23 08:39:42 crc kubenswrapper[5118]: I0223 08:39:42.706746 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 08:39:43 crc kubenswrapper[5118]: I0223 08:39:43.336142 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 08:39:43 crc kubenswrapper[5118]: I0223 08:39:43.382675 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d23ec113-64c1-41ec-84b2-e92ad8675129","Type":"ContainerStarted","Data":"00bc12bd6b712cc6014c490b2679ce04c13e9a32cc1a6bd6185e4f7a7f7a3a30"} Feb 23 08:39:43 crc kubenswrapper[5118]: I0223 08:39:43.697046 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:39:43 crc kubenswrapper[5118]: E0223 08:39:43.697524 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:39:44 crc kubenswrapper[5118]: I0223 08:39:44.395690 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-copy-data" event={"ID":"d23ec113-64c1-41ec-84b2-e92ad8675129","Type":"ContainerStarted","Data":"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63"} Feb 23 08:39:44 crc kubenswrapper[5118]: I0223 08:39:44.423942 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.207616245 podStartE2EDuration="3.423909764s" podCreationTimestamp="2026-02-23 08:39:41 +0000 UTC" firstStartedPulling="2026-02-23 08:39:43.342534807 +0000 UTC m=+6846.346319400" lastFinishedPulling="2026-02-23 08:39:43.558828346 +0000 UTC m=+6846.562612919" observedRunningTime="2026-02-23 08:39:44.416838904 +0000 UTC m=+6847.420623517" watchObservedRunningTime="2026-02-23 08:39:44.423909764 +0000 UTC m=+6847.427694367" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.184492 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.275605 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.276180 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="dnsmasq-dns" containerID="cri-o://8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046" gracePeriod=10 Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.806502 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.896961 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc\") pod \"87add532-23a8-4a35-a83c-10fbe3457f67\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.897026 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b87xf\" (UniqueName: \"kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf\") pod \"87add532-23a8-4a35-a83c-10fbe3457f67\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.897068 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config\") pod \"87add532-23a8-4a35-a83c-10fbe3457f67\" (UID: \"87add532-23a8-4a35-a83c-10fbe3457f67\") " Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.903142 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf" (OuterVolumeSpecName: "kube-api-access-b87xf") pod "87add532-23a8-4a35-a83c-10fbe3457f67" (UID: "87add532-23a8-4a35-a83c-10fbe3457f67"). InnerVolumeSpecName "kube-api-access-b87xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.935169 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config" (OuterVolumeSpecName: "config") pod "87add532-23a8-4a35-a83c-10fbe3457f67" (UID: "87add532-23a8-4a35-a83c-10fbe3457f67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.947346 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87add532-23a8-4a35-a83c-10fbe3457f67" (UID: "87add532-23a8-4a35-a83c-10fbe3457f67"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.998791 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.998848 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87add532-23a8-4a35-a83c-10fbe3457f67-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:46 crc kubenswrapper[5118]: I0223 08:39:46.998875 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b87xf\" (UniqueName: \"kubernetes.io/projected/87add532-23a8-4a35-a83c-10fbe3457f67-kube-api-access-b87xf\") on node \"crc\" DevicePath \"\"" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.428995 5118 generic.go:334] "Generic (PLEG): container finished" podID="87add532-23a8-4a35-a83c-10fbe3457f67" containerID="8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046" exitCode=0 Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.429054 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" event={"ID":"87add532-23a8-4a35-a83c-10fbe3457f67","Type":"ContainerDied","Data":"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046"} Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.429059 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.429112 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79496f79cc-s5blm" event={"ID":"87add532-23a8-4a35-a83c-10fbe3457f67","Type":"ContainerDied","Data":"1d65cce26dec757f52f85f7a943c46c382a6f05790a5a29141f09bb68391e473"} Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.429136 5118 scope.go:117] "RemoveContainer" containerID="8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.474330 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.474654 5118 scope.go:117] "RemoveContainer" containerID="15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.481387 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79496f79cc-s5blm"] Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.505588 5118 scope.go:117] "RemoveContainer" containerID="8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046" Feb 23 08:39:47 crc kubenswrapper[5118]: E0223 08:39:47.506135 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046\": container with ID starting with 8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046 not found: ID does not exist" containerID="8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.506170 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046"} err="failed to get container status 
\"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046\": rpc error: code = NotFound desc = could not find container \"8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046\": container with ID starting with 8027a537755ad7f3619d05a6f4383cc416d2793fef7324afccac38db369fe046 not found: ID does not exist" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.506193 5118 scope.go:117] "RemoveContainer" containerID="15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7" Feb 23 08:39:47 crc kubenswrapper[5118]: E0223 08:39:47.506620 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7\": container with ID starting with 15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7 not found: ID does not exist" containerID="15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.506670 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7"} err="failed to get container status \"15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7\": rpc error: code = NotFound desc = could not find container \"15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7\": container with ID starting with 15a371d268796b94044629f87066b416b624872572d0adb8331e4223d2387fd7 not found: ID does not exist" Feb 23 08:39:47 crc kubenswrapper[5118]: I0223 08:39:47.714326 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" path="/var/lib/kubelet/pods/87add532-23a8-4a35-a83c-10fbe3457f67/volumes" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.856683 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:39:52 crc kubenswrapper[5118]: 
E0223 08:39:52.858051 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="init" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.858068 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="init" Feb 23 08:39:52 crc kubenswrapper[5118]: E0223 08:39:52.858082 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="dnsmasq-dns" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.858087 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="dnsmasq-dns" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.858266 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="87add532-23a8-4a35-a83c-10fbe3457f67" containerName="dnsmasq-dns" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.859208 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.864236 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xr8xn" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.864524 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.864670 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.876425 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.929190 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c896a524-fcc3-4636-867e-09ff5e556861-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.929267 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjs9k\" (UniqueName: \"kubernetes.io/projected/c896a524-fcc3-4636-867e-09ff5e556861-kube-api-access-kjs9k\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.929292 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-scripts\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.929328 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c896a524-fcc3-4636-867e-09ff5e556861-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:52 crc kubenswrapper[5118]: I0223 08:39:52.929791 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-config\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.031489 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-config\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.031569 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c896a524-fcc3-4636-867e-09ff5e556861-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.031608 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjs9k\" (UniqueName: \"kubernetes.io/projected/c896a524-fcc3-4636-867e-09ff5e556861-kube-api-access-kjs9k\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.031631 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-scripts\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 
08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.031647 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c896a524-fcc3-4636-867e-09ff5e556861-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.032427 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c896a524-fcc3-4636-867e-09ff5e556861-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.032922 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-scripts\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.033244 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c896a524-fcc3-4636-867e-09ff5e556861-config\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.040239 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c896a524-fcc3-4636-867e-09ff5e556861-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.050332 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjs9k\" (UniqueName: \"kubernetes.io/projected/c896a524-fcc3-4636-867e-09ff5e556861-kube-api-access-kjs9k\") pod \"ovn-northd-0\" (UID: 
\"c896a524-fcc3-4636-867e-09ff5e556861\") " pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.187382 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 08:39:53 crc kubenswrapper[5118]: I0223 08:39:53.716752 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 08:39:54 crc kubenswrapper[5118]: I0223 08:39:54.527451 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c896a524-fcc3-4636-867e-09ff5e556861","Type":"ContainerStarted","Data":"7660006c8545996e366773d7366be8fa63e0c659e0c606c4c5cdf6dd6946ba08"} Feb 23 08:39:56 crc kubenswrapper[5118]: I0223 08:39:56.558524 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c896a524-fcc3-4636-867e-09ff5e556861","Type":"ContainerStarted","Data":"6a464357b1b99fe6cd149463794c0fe203a2335b334baf7ca237a3264d52b253"} Feb 23 08:39:56 crc kubenswrapper[5118]: I0223 08:39:56.559205 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 23 08:39:56 crc kubenswrapper[5118]: I0223 08:39:56.559230 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c896a524-fcc3-4636-867e-09ff5e556861","Type":"ContainerStarted","Data":"0879c89cfcac15930d3719c2a8328bfe8cbf434eab76362a18a4b830f6fee3a6"} Feb 23 08:39:58 crc kubenswrapper[5118]: I0223 08:39:58.697939 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:39:58 crc kubenswrapper[5118]: E0223 08:39:58.698772 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.146474 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=11.033416801 podStartE2EDuration="13.146435038s" podCreationTimestamp="2026-02-23 08:39:52 +0000 UTC" firstStartedPulling="2026-02-23 08:39:53.706749935 +0000 UTC m=+6856.710534508" lastFinishedPulling="2026-02-23 08:39:55.819768172 +0000 UTC m=+6858.823552745" observedRunningTime="2026-02-23 08:39:56.589600616 +0000 UTC m=+6859.593385259" watchObservedRunningTime="2026-02-23 08:40:05.146435038 +0000 UTC m=+6868.150219631" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.148457 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-tgzf2"] Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.149953 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.156574 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tgzf2"] Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.186533 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rthl\" (UniqueName: \"kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl\") pod \"keystone-db-create-tgzf2\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.186631 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts\") pod \"keystone-db-create-tgzf2\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.240837 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9e19-account-create-update-pmfcs"] Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.241932 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.243803 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.263745 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9e19-account-create-update-pmfcs"] Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.287509 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts\") pod \"keystone-db-create-tgzf2\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.287616 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts\") pod \"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.287668 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsggj\" (UniqueName: \"kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj\") pod \"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.287693 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rthl\" (UniqueName: \"kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl\") pod \"keystone-db-create-tgzf2\" 
(UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.288388 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts\") pod \"keystone-db-create-tgzf2\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.323955 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rthl\" (UniqueName: \"kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl\") pod \"keystone-db-create-tgzf2\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.392289 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts\") pod \"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.392380 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsggj\" (UniqueName: \"kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj\") pod \"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.393644 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts\") pod 
\"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.440442 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsggj\" (UniqueName: \"kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj\") pod \"keystone-9e19-account-create-update-pmfcs\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.470575 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:05 crc kubenswrapper[5118]: I0223 08:40:05.557732 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.009951 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tgzf2"] Feb 23 08:40:06 crc kubenswrapper[5118]: W0223 08:40:06.019517 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c7a2f84_6788_4fae_a465_b0cf157c16e5.slice/crio-dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424 WatchSource:0}: Error finding container dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424: Status 404 returned error can't find the container with id dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424 Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.150782 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9e19-account-create-update-pmfcs"] Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.688051 5118 generic.go:334] "Generic (PLEG): container finished" podID="eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" 
containerID="203e325c2db89155b3ef8068e97b6c4fddeb91994f94fab144f7d1123b4bd143" exitCode=0 Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.688214 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e19-account-create-update-pmfcs" event={"ID":"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76","Type":"ContainerDied","Data":"203e325c2db89155b3ef8068e97b6c4fddeb91994f94fab144f7d1123b4bd143"} Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.688261 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e19-account-create-update-pmfcs" event={"ID":"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76","Type":"ContainerStarted","Data":"467d38c102051f85991147e1333e73d453594913368450d7a4d68d8dd24743c1"} Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.691187 5118 generic.go:334] "Generic (PLEG): container finished" podID="9c7a2f84-6788-4fae-a465-b0cf157c16e5" containerID="1e986731a23223ef3011ca66015a2a6164654579cd221931de7f6423bdd94d3a" exitCode=0 Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.691271 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tgzf2" event={"ID":"9c7a2f84-6788-4fae-a465-b0cf157c16e5","Type":"ContainerDied","Data":"1e986731a23223ef3011ca66015a2a6164654579cd221931de7f6423bdd94d3a"} Feb 23 08:40:06 crc kubenswrapper[5118]: I0223 08:40:06.691321 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tgzf2" event={"ID":"9c7a2f84-6788-4fae-a465-b0cf157c16e5","Type":"ContainerStarted","Data":"dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424"} Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.212751 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.217111 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.271080 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts\") pod \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.271213 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rthl\" (UniqueName: \"kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl\") pod \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.271243 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsggj\" (UniqueName: \"kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj\") pod \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\" (UID: \"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76\") " Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.271271 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts\") pod \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\" (UID: \"9c7a2f84-6788-4fae-a465-b0cf157c16e5\") " Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.272159 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c7a2f84-6788-4fae-a465-b0cf157c16e5" (UID: "9c7a2f84-6788-4fae-a465-b0cf157c16e5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.272157 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" (UID: "eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.282209 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj" (OuterVolumeSpecName: "kube-api-access-zsggj") pod "eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" (UID: "eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76"). InnerVolumeSpecName "kube-api-access-zsggj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.282265 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl" (OuterVolumeSpecName: "kube-api-access-5rthl") pod "9c7a2f84-6788-4fae-a465-b0cf157c16e5" (UID: "9c7a2f84-6788-4fae-a465-b0cf157c16e5"). InnerVolumeSpecName "kube-api-access-5rthl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.373435 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rthl\" (UniqueName: \"kubernetes.io/projected/9c7a2f84-6788-4fae-a465-b0cf157c16e5-kube-api-access-5rthl\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.373488 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsggj\" (UniqueName: \"kubernetes.io/projected/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-kube-api-access-zsggj\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.373506 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c7a2f84-6788-4fae-a465-b0cf157c16e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.373521 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.717257 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tgzf2" event={"ID":"9c7a2f84-6788-4fae-a465-b0cf157c16e5","Type":"ContainerDied","Data":"dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424"} Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.717322 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tgzf2" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.717328 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfcd9368dcf0dc9906bb371ae1e629f73dc4576f5a22805fd6b615a193078424" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.719810 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9e19-account-create-update-pmfcs" event={"ID":"eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76","Type":"ContainerDied","Data":"467d38c102051f85991147e1333e73d453594913368450d7a4d68d8dd24743c1"} Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.719874 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="467d38c102051f85991147e1333e73d453594913368450d7a4d68d8dd24743c1" Feb 23 08:40:08 crc kubenswrapper[5118]: I0223 08:40:08.720008 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9e19-account-create-update-pmfcs" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.685805 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-t4jr4"] Feb 23 08:40:10 crc kubenswrapper[5118]: E0223 08:40:10.686222 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c7a2f84-6788-4fae-a465-b0cf157c16e5" containerName="mariadb-database-create" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.686237 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c7a2f84-6788-4fae-a465-b0cf157c16e5" containerName="mariadb-database-create" Feb 23 08:40:10 crc kubenswrapper[5118]: E0223 08:40:10.686249 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" containerName="mariadb-account-create-update" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.686258 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" 
containerName="mariadb-account-create-update" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.686479 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c7a2f84-6788-4fae-a465-b0cf157c16e5" containerName="mariadb-database-create" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.686501 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" containerName="mariadb-account-create-update" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.687200 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.689661 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.690364 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.691274 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9sspz" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.694432 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.718283 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-t4jr4"] Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.719510 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.719670 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh97v\" (UniqueName: \"kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.720169 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.821415 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.821512 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh97v\" (UniqueName: \"kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.821692 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.828039 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.829025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:10 crc kubenswrapper[5118]: I0223 08:40:10.848971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh97v\" (UniqueName: \"kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v\") pod \"keystone-db-sync-t4jr4\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:11 crc kubenswrapper[5118]: I0223 08:40:11.011939 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:11 crc kubenswrapper[5118]: I0223 08:40:11.509730 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-t4jr4"] Feb 23 08:40:11 crc kubenswrapper[5118]: W0223 08:40:11.510113 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd843d0a0_c5d1_47e0_8952_5504b7d79b88.slice/crio-f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64 WatchSource:0}: Error finding container f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64: Status 404 returned error can't find the container with id f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64 Feb 23 08:40:11 crc kubenswrapper[5118]: I0223 08:40:11.762957 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4jr4" event={"ID":"d843d0a0-c5d1-47e0-8952-5504b7d79b88","Type":"ContainerStarted","Data":"f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64"} Feb 23 08:40:13 crc kubenswrapper[5118]: I0223 08:40:13.269897 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 23 08:40:13 crc kubenswrapper[5118]: I0223 08:40:13.698543 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:40:13 crc kubenswrapper[5118]: E0223 08:40:13.699235 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:40:17 crc kubenswrapper[5118]: I0223 08:40:17.821213 5118 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4jr4" event={"ID":"d843d0a0-c5d1-47e0-8952-5504b7d79b88","Type":"ContainerStarted","Data":"3e1a8298b1d3fc26fe3bcf9f090bc060316781292f29b2bdd1f6aad49531de10"} Feb 23 08:40:17 crc kubenswrapper[5118]: I0223 08:40:17.857275 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-t4jr4" podStartSLOduration=2.775402432 podStartE2EDuration="7.857243901s" podCreationTimestamp="2026-02-23 08:40:10 +0000 UTC" firstStartedPulling="2026-02-23 08:40:11.514300775 +0000 UTC m=+6874.518085368" lastFinishedPulling="2026-02-23 08:40:16.596142254 +0000 UTC m=+6879.599926837" observedRunningTime="2026-02-23 08:40:17.845275062 +0000 UTC m=+6880.849059645" watchObservedRunningTime="2026-02-23 08:40:17.857243901 +0000 UTC m=+6880.861028514" Feb 23 08:40:18 crc kubenswrapper[5118]: I0223 08:40:18.836739 5118 generic.go:334] "Generic (PLEG): container finished" podID="d843d0a0-c5d1-47e0-8952-5504b7d79b88" containerID="3e1a8298b1d3fc26fe3bcf9f090bc060316781292f29b2bdd1f6aad49531de10" exitCode=0 Feb 23 08:40:18 crc kubenswrapper[5118]: I0223 08:40:18.836903 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4jr4" event={"ID":"d843d0a0-c5d1-47e0-8952-5504b7d79b88","Type":"ContainerDied","Data":"3e1a8298b1d3fc26fe3bcf9f090bc060316781292f29b2bdd1f6aad49531de10"} Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.276933 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.411705 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data\") pod \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.411892 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh97v\" (UniqueName: \"kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v\") pod \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.411924 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle\") pod \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\" (UID: \"d843d0a0-c5d1-47e0-8952-5504b7d79b88\") " Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.418753 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v" (OuterVolumeSpecName: "kube-api-access-fh97v") pod "d843d0a0-c5d1-47e0-8952-5504b7d79b88" (UID: "d843d0a0-c5d1-47e0-8952-5504b7d79b88"). InnerVolumeSpecName "kube-api-access-fh97v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.439417 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d843d0a0-c5d1-47e0-8952-5504b7d79b88" (UID: "d843d0a0-c5d1-47e0-8952-5504b7d79b88"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.491016 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data" (OuterVolumeSpecName: "config-data") pod "d843d0a0-c5d1-47e0-8952-5504b7d79b88" (UID: "d843d0a0-c5d1-47e0-8952-5504b7d79b88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.513867 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.513902 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh97v\" (UniqueName: \"kubernetes.io/projected/d843d0a0-c5d1-47e0-8952-5504b7d79b88-kube-api-access-fh97v\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.513915 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d843d0a0-c5d1-47e0-8952-5504b7d79b88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.858121 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4jr4" event={"ID":"d843d0a0-c5d1-47e0-8952-5504b7d79b88","Type":"ContainerDied","Data":"f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64"} Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.858172 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e17d197f78f0d083c20f23cee002f910c9a78ddb16a1bde329fb8abff24f64" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.858266 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-t4jr4" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.998071 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"] Feb 23 08:40:20 crc kubenswrapper[5118]: E0223 08:40:20.998512 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d843d0a0-c5d1-47e0-8952-5504b7d79b88" containerName="keystone-db-sync" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.998533 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d843d0a0-c5d1-47e0-8952-5504b7d79b88" containerName="keystone-db-sync" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.998723 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d843d0a0-c5d1-47e0-8952-5504b7d79b88" containerName="keystone-db-sync" Feb 23 08:40:20 crc kubenswrapper[5118]: I0223 08:40:20.999628 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.029069 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"] Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.063350 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pw6bc"] Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.064884 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.068976 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.069259 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.069433 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.069562 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9sspz" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.069732 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.078176 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pw6bc"] Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.126415 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.126744 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.126835 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.126924 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgv4\" (UniqueName: \"kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.127024 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228203 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228278 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvkbb\" (UniqueName: \"kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228313 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2rgv4\" (UniqueName: \"kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228336 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228368 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228408 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228478 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228501 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228528 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228560 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.228587 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.229804 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.230827 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config\") pod 
\"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.231555 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.232270 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.247481 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgv4\" (UniqueName: \"kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4\") pod \"dnsmasq-dns-fd79565d5-np795\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") " pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.325220 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.332435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.332508 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.332560 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.332604 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkbb\" (UniqueName: \"kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.332631 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 
08:40:21.332657 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.337214 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.337718 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.337951 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.342511 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.344659 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.352155 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkbb\" (UniqueName: \"kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb\") pod \"keystone-bootstrap-pw6bc\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.388286 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.836237 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"] Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.867996 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd79565d5-np795" event={"ID":"c8fc4131-d6ba-4758-876d-9b0c1fe23c83","Type":"ContainerStarted","Data":"977427d847862612e5f87fe8d0e9c7dafb3a61759cae63bf48dd859d877f0b9c"} Feb 23 08:40:21 crc kubenswrapper[5118]: I0223 08:40:21.924963 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pw6bc"] Feb 23 08:40:21 crc kubenswrapper[5118]: W0223 08:40:21.931593 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30d42146_ffb7_4420_a334_b73d800d697c.slice/crio-cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585 WatchSource:0}: Error finding container cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585: Status 404 returned error can't find the container with id cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585 Feb 23 08:40:22 crc kubenswrapper[5118]: I0223 
08:40:22.884781 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pw6bc" event={"ID":"30d42146-ffb7-4420-a334-b73d800d697c","Type":"ContainerStarted","Data":"fe7be9c4b8a68ea5367a7935c0c4cd49788b11cc6f1fa945035c6d7f19706c89"} Feb 23 08:40:22 crc kubenswrapper[5118]: I0223 08:40:22.886484 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pw6bc" event={"ID":"30d42146-ffb7-4420-a334-b73d800d697c","Type":"ContainerStarted","Data":"cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585"} Feb 23 08:40:22 crc kubenswrapper[5118]: I0223 08:40:22.888306 5118 generic.go:334] "Generic (PLEG): container finished" podID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerID="84e51fed3b79b147343402e0dedae7126a9ab2e625829130d70116819d24569b" exitCode=0 Feb 23 08:40:22 crc kubenswrapper[5118]: I0223 08:40:22.888403 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd79565d5-np795" event={"ID":"c8fc4131-d6ba-4758-876d-9b0c1fe23c83","Type":"ContainerDied","Data":"84e51fed3b79b147343402e0dedae7126a9ab2e625829130d70116819d24569b"} Feb 23 08:40:22 crc kubenswrapper[5118]: I0223 08:40:22.927768 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pw6bc" podStartSLOduration=1.9277392359999999 podStartE2EDuration="1.927739236s" podCreationTimestamp="2026-02-23 08:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:40:22.919639781 +0000 UTC m=+6885.923424374" watchObservedRunningTime="2026-02-23 08:40:22.927739236 +0000 UTC m=+6885.931523829" Feb 23 08:40:23 crc kubenswrapper[5118]: I0223 08:40:23.903989 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd79565d5-np795" 
event={"ID":"c8fc4131-d6ba-4758-876d-9b0c1fe23c83","Type":"ContainerStarted","Data":"cc78c6498a22f53dcfd7275c02f021f05352f39cbc38b61b42d39fb01a110147"} Feb 23 08:40:23 crc kubenswrapper[5118]: I0223 08:40:23.944182 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd79565d5-np795" podStartSLOduration=3.943932754 podStartE2EDuration="3.943932754s" podCreationTimestamp="2026-02-23 08:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:40:23.933144174 +0000 UTC m=+6886.936928777" watchObservedRunningTime="2026-02-23 08:40:23.943932754 +0000 UTC m=+6886.947717367" Feb 23 08:40:24 crc kubenswrapper[5118]: I0223 08:40:24.914086 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:25 crc kubenswrapper[5118]: I0223 08:40:25.697523 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:40:25 crc kubenswrapper[5118]: E0223 08:40:25.697745 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:40:25 crc kubenswrapper[5118]: I0223 08:40:25.926175 5118 generic.go:334] "Generic (PLEG): container finished" podID="30d42146-ffb7-4420-a334-b73d800d697c" containerID="fe7be9c4b8a68ea5367a7935c0c4cd49788b11cc6f1fa945035c6d7f19706c89" exitCode=0 Feb 23 08:40:25 crc kubenswrapper[5118]: I0223 08:40:25.926231 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pw6bc" 
event={"ID":"30d42146-ffb7-4420-a334-b73d800d697c","Type":"ContainerDied","Data":"fe7be9c4b8a68ea5367a7935c0c4cd49788b11cc6f1fa945035c6d7f19706c89"} Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.375822 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501730 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts\") pod \"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501856 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data\") pod \"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501902 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle\") pod \"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501933 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys\") pod \"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501958 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkbb\" (UniqueName: \"kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb\") pod 
\"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.501981 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys\") pod \"30d42146-ffb7-4420-a334-b73d800d697c\" (UID: \"30d42146-ffb7-4420-a334-b73d800d697c\") " Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.509207 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.509253 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb" (OuterVolumeSpecName: "kube-api-access-cvkbb") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "kube-api-access-cvkbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.509680 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts" (OuterVolumeSpecName: "scripts") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.509977 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.530371 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.539428 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data" (OuterVolumeSpecName: "config-data") pod "30d42146-ffb7-4420-a334-b73d800d697c" (UID: "30d42146-ffb7-4420-a334-b73d800d697c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604397 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604470 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604496 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604520 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604544 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkbb\" (UniqueName: \"kubernetes.io/projected/30d42146-ffb7-4420-a334-b73d800d697c-kube-api-access-cvkbb\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.604566 5118 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/30d42146-ffb7-4420-a334-b73d800d697c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.958867 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pw6bc" event={"ID":"30d42146-ffb7-4420-a334-b73d800d697c","Type":"ContainerDied","Data":"cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585"} Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 
08:40:27.959404 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf69af7f837db7aad0a9bdc5ecbdde8dfcb437e41636b6180cf4a27ad877c585" Feb 23 08:40:27 crc kubenswrapper[5118]: I0223 08:40:27.958982 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pw6bc" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.060933 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pw6bc"] Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.072540 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pw6bc"] Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.145351 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rhx92"] Feb 23 08:40:28 crc kubenswrapper[5118]: E0223 08:40:28.146475 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d42146-ffb7-4420-a334-b73d800d697c" containerName="keystone-bootstrap" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.146720 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d42146-ffb7-4420-a334-b73d800d697c" containerName="keystone-bootstrap" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.147345 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d42146-ffb7-4420-a334-b73d800d697c" containerName="keystone-bootstrap" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.148494 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.151699 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.151773 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.152138 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.152196 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.152436 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9sspz" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.156150 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rhx92"] Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.314886 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fwdr\" (UniqueName: \"kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.315705 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.315945 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.316035 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.316164 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.316578 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.418699 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.418785 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.418844 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.419004 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.419296 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fwdr\" (UniqueName: \"kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.419765 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.428817 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys\") pod \"keystone-bootstrap-rhx92\" (UID: 
\"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.437450 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.439626 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.439647 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.440402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.453730 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fwdr\" (UniqueName: \"kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr\") pod \"keystone-bootstrap-rhx92\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:28 crc kubenswrapper[5118]: I0223 08:40:28.494833 
5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:29 crc kubenswrapper[5118]: I0223 08:40:29.027905 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rhx92"] Feb 23 08:40:29 crc kubenswrapper[5118]: I0223 08:40:29.722126 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d42146-ffb7-4420-a334-b73d800d697c" path="/var/lib/kubelet/pods/30d42146-ffb7-4420-a334-b73d800d697c/volumes" Feb 23 08:40:29 crc kubenswrapper[5118]: I0223 08:40:29.979916 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhx92" event={"ID":"062a4716-72d8-4714-bee7-679602f5df50","Type":"ContainerStarted","Data":"2e865aad1edfc20cfe99140a3916f976fa142f7f9704d7ff8a3981adfdcb865d"} Feb 23 08:40:29 crc kubenswrapper[5118]: I0223 08:40:29.979972 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhx92" event={"ID":"062a4716-72d8-4714-bee7-679602f5df50","Type":"ContainerStarted","Data":"450137f64c0fde28462a5e8eb0bb5d48b3b49c09c0502d2fd75a35b25157031d"} Feb 23 08:40:30 crc kubenswrapper[5118]: I0223 08:40:30.017493 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rhx92" podStartSLOduration=2.017462601 podStartE2EDuration="2.017462601s" podCreationTimestamp="2026-02-23 08:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:40:30.006716712 +0000 UTC m=+6893.010501325" watchObservedRunningTime="2026-02-23 08:40:30.017462601 +0000 UTC m=+6893.021247214" Feb 23 08:40:31 crc kubenswrapper[5118]: I0223 08:40:31.327510 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd79565d5-np795" Feb 23 08:40:31 crc kubenswrapper[5118]: I0223 08:40:31.413470 5118 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:40:31 crc kubenswrapper[5118]: I0223 08:40:31.413805 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="dnsmasq-dns" containerID="cri-o://ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3" gracePeriod=10 Feb 23 08:40:31 crc kubenswrapper[5118]: I0223 08:40:31.963333 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.005394 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc\") pod \"4a992050-916c-4cad-873e-cef0807ef46a\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.005439 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config\") pod \"4a992050-916c-4cad-873e-cef0807ef46a\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.005461 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb\") pod \"4a992050-916c-4cad-873e-cef0807ef46a\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.005522 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j55t7\" (UniqueName: \"kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7\") pod \"4a992050-916c-4cad-873e-cef0807ef46a\" (UID: 
\"4a992050-916c-4cad-873e-cef0807ef46a\") " Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.005547 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb\") pod \"4a992050-916c-4cad-873e-cef0807ef46a\" (UID: \"4a992050-916c-4cad-873e-cef0807ef46a\") " Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.015392 5118 generic.go:334] "Generic (PLEG): container finished" podID="062a4716-72d8-4714-bee7-679602f5df50" containerID="2e865aad1edfc20cfe99140a3916f976fa142f7f9704d7ff8a3981adfdcb865d" exitCode=0 Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.015496 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhx92" event={"ID":"062a4716-72d8-4714-bee7-679602f5df50","Type":"ContainerDied","Data":"2e865aad1edfc20cfe99140a3916f976fa142f7f9704d7ff8a3981adfdcb865d"} Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.027523 5118 generic.go:334] "Generic (PLEG): container finished" podID="4a992050-916c-4cad-873e-cef0807ef46a" containerID="ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3" exitCode=0 Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.027585 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" event={"ID":"4a992050-916c-4cad-873e-cef0807ef46a","Type":"ContainerDied","Data":"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3"} Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.027622 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" event={"ID":"4a992050-916c-4cad-873e-cef0807ef46a","Type":"ContainerDied","Data":"bd279d661d7470d6fa32ad27ef975ae99579f535b7ff02a54676ddcd9b9bfb1c"} Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.027646 5118 scope.go:117] "RemoveContainer" 
containerID="ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.027805 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568588dc4c-pjd2f" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.042470 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7" (OuterVolumeSpecName: "kube-api-access-j55t7") pod "4a992050-916c-4cad-873e-cef0807ef46a" (UID: "4a992050-916c-4cad-873e-cef0807ef46a"). InnerVolumeSpecName "kube-api-access-j55t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.074252 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a992050-916c-4cad-873e-cef0807ef46a" (UID: "4a992050-916c-4cad-873e-cef0807ef46a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.076807 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config" (OuterVolumeSpecName: "config") pod "4a992050-916c-4cad-873e-cef0807ef46a" (UID: "4a992050-916c-4cad-873e-cef0807ef46a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.082561 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a992050-916c-4cad-873e-cef0807ef46a" (UID: "4a992050-916c-4cad-873e-cef0807ef46a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.088867 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a992050-916c-4cad-873e-cef0807ef46a" (UID: "4a992050-916c-4cad-873e-cef0807ef46a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.107357 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j55t7\" (UniqueName: \"kubernetes.io/projected/4a992050-916c-4cad-873e-cef0807ef46a-kube-api-access-j55t7\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.107388 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.107398 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.107410 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.107422 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a992050-916c-4cad-873e-cef0807ef46a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.138033 5118 scope.go:117] "RemoveContainer" containerID="3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309" Feb 23 08:40:32 crc 
kubenswrapper[5118]: I0223 08:40:32.160056 5118 scope.go:117] "RemoveContainer" containerID="ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3" Feb 23 08:40:32 crc kubenswrapper[5118]: E0223 08:40:32.160772 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3\": container with ID starting with ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3 not found: ID does not exist" containerID="ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.160866 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3"} err="failed to get container status \"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3\": rpc error: code = NotFound desc = could not find container \"ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3\": container with ID starting with ad332d5827ed6625af730fd54fc2b9880005cfab76c00b8f09c2e3f5c62e33a3 not found: ID does not exist" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.160945 5118 scope.go:117] "RemoveContainer" containerID="3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309" Feb 23 08:40:32 crc kubenswrapper[5118]: E0223 08:40:32.161413 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309\": container with ID starting with 3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309 not found: ID does not exist" containerID="3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.161461 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309"} err="failed to get container status \"3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309\": rpc error: code = NotFound desc = could not find container \"3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309\": container with ID starting with 3651d5b9d3cf4c8f5f3a56a0d53965c36ee0fb28c9266c48b769f740ef96b309 not found: ID does not exist" Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.364169 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:40:32 crc kubenswrapper[5118]: I0223 08:40:32.372927 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568588dc4c-pjd2f"] Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.542464 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.715347 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a992050-916c-4cad-873e-cef0807ef46a" path="/var/lib/kubelet/pods/4a992050-916c-4cad-873e-cef0807ef46a/volumes" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.740652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fwdr\" (UniqueName: \"kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.741127 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: 
I0223 08:40:33.741202 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.741241 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.741403 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.741478 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys\") pod \"062a4716-72d8-4714-bee7-679602f5df50\" (UID: \"062a4716-72d8-4714-bee7-679602f5df50\") " Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.747398 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.748457 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts" (OuterVolumeSpecName: "scripts") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.748514 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr" (OuterVolumeSpecName: "kube-api-access-4fwdr") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "kube-api-access-4fwdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.748874 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.771089 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.782734 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data" (OuterVolumeSpecName: "config-data") pod "062a4716-72d8-4714-bee7-679602f5df50" (UID: "062a4716-72d8-4714-bee7-679602f5df50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844858 5118 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844896 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844908 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844919 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844932 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fwdr\" (UniqueName: \"kubernetes.io/projected/062a4716-72d8-4714-bee7-679602f5df50-kube-api-access-4fwdr\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:33 crc kubenswrapper[5118]: I0223 08:40:33.844951 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/062a4716-72d8-4714-bee7-679602f5df50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.056907 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhx92" event={"ID":"062a4716-72d8-4714-bee7-679602f5df50","Type":"ContainerDied","Data":"450137f64c0fde28462a5e8eb0bb5d48b3b49c09c0502d2fd75a35b25157031d"} Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.057248 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="450137f64c0fde28462a5e8eb0bb5d48b3b49c09c0502d2fd75a35b25157031d" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.057244 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhx92" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.226868 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6998df9889-njjfn"] Feb 23 08:40:34 crc kubenswrapper[5118]: E0223 08:40:34.227250 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="dnsmasq-dns" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.227267 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="dnsmasq-dns" Feb 23 08:40:34 crc kubenswrapper[5118]: E0223 08:40:34.227288 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062a4716-72d8-4714-bee7-679602f5df50" containerName="keystone-bootstrap" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.227295 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="062a4716-72d8-4714-bee7-679602f5df50" containerName="keystone-bootstrap" Feb 23 08:40:34 crc kubenswrapper[5118]: E0223 08:40:34.227309 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="init" Feb 23 08:40:34 crc 
kubenswrapper[5118]: I0223 08:40:34.227315 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="init" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.227472 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a992050-916c-4cad-873e-cef0807ef46a" containerName="dnsmasq-dns" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.227495 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="062a4716-72d8-4714-bee7-679602f5df50" containerName="keystone-bootstrap" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.228045 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.236176 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.236454 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6998df9889-njjfn"] Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.236779 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.237264 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9sspz" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254180 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-fernet-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254258 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-combined-ca-bundle\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254287 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-scripts\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254330 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-config-data\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254357 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gh9\" (UniqueName: \"kubernetes.io/projected/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-kube-api-access-29gh9\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.254388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-credential-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.259795 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 08:40:34 crc 
kubenswrapper[5118]: I0223 08:40:34.354871 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-credential-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.355006 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-fernet-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.355060 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-combined-ca-bundle\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.355108 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-scripts\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.355145 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-config-data\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.355174 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-29gh9\" (UniqueName: \"kubernetes.io/projected/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-kube-api-access-29gh9\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.370310 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-fernet-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.377416 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-combined-ca-bundle\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.377765 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-credential-keys\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.386855 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-config-data\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.387296 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-scripts\") pod 
\"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.387870 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gh9\" (UniqueName: \"kubernetes.io/projected/1688eed1-a62c-4fec-b12d-c1eb1a45d6cc-kube-api-access-29gh9\") pod \"keystone-6998df9889-njjfn\" (UID: \"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc\") " pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:34 crc kubenswrapper[5118]: I0223 08:40:34.543276 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:35 crc kubenswrapper[5118]: I0223 08:40:34.999658 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6998df9889-njjfn"] Feb 23 08:40:35 crc kubenswrapper[5118]: W0223 08:40:35.006283 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1688eed1_a62c_4fec_b12d_c1eb1a45d6cc.slice/crio-670ac78e6cc1fb62b5ed69f578be332f5702f4a745ad3ab7deff69566d93fb6f WatchSource:0}: Error finding container 670ac78e6cc1fb62b5ed69f578be332f5702f4a745ad3ab7deff69566d93fb6f: Status 404 returned error can't find the container with id 670ac78e6cc1fb62b5ed69f578be332f5702f4a745ad3ab7deff69566d93fb6f Feb 23 08:40:35 crc kubenswrapper[5118]: I0223 08:40:35.068228 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6998df9889-njjfn" event={"ID":"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc","Type":"ContainerStarted","Data":"670ac78e6cc1fb62b5ed69f578be332f5702f4a745ad3ab7deff69566d93fb6f"} Feb 23 08:40:36 crc kubenswrapper[5118]: I0223 08:40:36.079417 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6998df9889-njjfn" 
event={"ID":"1688eed1-a62c-4fec-b12d-c1eb1a45d6cc","Type":"ContainerStarted","Data":"1852b52a6b06bfb8b60cef2f2208279034361d91afb2a61f1c7cb79ba5e3b02e"} Feb 23 08:40:36 crc kubenswrapper[5118]: I0223 08:40:36.080037 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:40:36 crc kubenswrapper[5118]: I0223 08:40:36.120391 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6998df9889-njjfn" podStartSLOduration=2.120357004 podStartE2EDuration="2.120357004s" podCreationTimestamp="2026-02-23 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:40:36.107503684 +0000 UTC m=+6899.111288347" watchObservedRunningTime="2026-02-23 08:40:36.120357004 +0000 UTC m=+6899.124141617" Feb 23 08:40:39 crc kubenswrapper[5118]: I0223 08:40:39.697962 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:40:39 crc kubenswrapper[5118]: E0223 08:40:39.700339 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:40:52 crc kubenswrapper[5118]: I0223 08:40:52.698446 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:40:52 crc kubenswrapper[5118]: E0223 08:40:52.700414 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:41:04 crc kubenswrapper[5118]: I0223 08:41:04.698469 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:41:04 crc kubenswrapper[5118]: E0223 08:41:04.699647 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:41:06 crc kubenswrapper[5118]: I0223 08:41:06.026325 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6998df9889-njjfn" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.141749 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.144971 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.151291 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.154768 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gf7n6" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.156036 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.164132 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.232514 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: E0223 08:41:10.233250 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2456z openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="931dd5ed-6c85-46c7-96c1-85106ac4b986" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.237028 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.237198 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " 
pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.237305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2456z\" (UniqueName: \"kubernetes.io/projected/931dd5ed-6c85-46c7-96c1-85106ac4b986-kube-api-access-2456z\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.241031 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.274610 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.276365 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.288957 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339196 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339333 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339376 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339413 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvk4d\" (UniqueName: \"kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339437 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.339500 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2456z\" (UniqueName: \"kubernetes.io/projected/931dd5ed-6c85-46c7-96c1-85106ac4b986-kube-api-access-2456z\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.340531 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: E0223 08:41:10.342417 5118 projected.go:194] Error preparing data for projected volume kube-api-access-2456z for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID 
in the bound object reference (931dd5ed-6c85-46c7-96c1-85106ac4b986) does not match the UID in record. The object might have been deleted and then recreated Feb 23 08:41:10 crc kubenswrapper[5118]: E0223 08:41:10.342512 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/931dd5ed-6c85-46c7-96c1-85106ac4b986-kube-api-access-2456z podName:931dd5ed-6c85-46c7-96c1-85106ac4b986 nodeName:}" failed. No retries permitted until 2026-02-23 08:41:10.842484551 +0000 UTC m=+6933.846269144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-2456z" (UniqueName: "kubernetes.io/projected/931dd5ed-6c85-46c7-96c1-85106ac4b986-kube-api-access-2456z") pod "openstackclient" (UID: "931dd5ed-6c85-46c7-96c1-85106ac4b986") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (931dd5ed-6c85-46c7-96c1-85106ac4b986) does not match the UID in record. The object might have been deleted and then recreated Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.349079 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret\") pod \"openstackclient\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.442510 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.442579 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvk4d\" (UniqueName: 
\"kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.442602 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.442637 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.443863 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.446726 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.450803 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="931dd5ed-6c85-46c7-96c1-85106ac4b986" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.470684 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvk4d\" (UniqueName: 
\"kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d\") pod \"openstackclient\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.509058 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.544161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config\") pod \"931dd5ed-6c85-46c7-96c1-85106ac4b986\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.544212 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret\") pod \"931dd5ed-6c85-46c7-96c1-85106ac4b986\" (UID: \"931dd5ed-6c85-46c7-96c1-85106ac4b986\") " Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.544723 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2456z\" (UniqueName: \"kubernetes.io/projected/931dd5ed-6c85-46c7-96c1-85106ac4b986-kube-api-access-2456z\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.546090 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "931dd5ed-6c85-46c7-96c1-85106ac4b986" (UID: "931dd5ed-6c85-46c7-96c1-85106ac4b986"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.548829 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "931dd5ed-6c85-46c7-96c1-85106ac4b986" (UID: "931dd5ed-6c85-46c7-96c1-85106ac4b986"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.608805 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.646676 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.646734 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/931dd5ed-6c85-46c7-96c1-85106ac4b986-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:41:10 crc kubenswrapper[5118]: I0223 08:41:10.927645 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 08:41:10 crc kubenswrapper[5118]: W0223 08:41:10.930275 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8be430c_8919_4285_b27c_1df1a6765f35.slice/crio-8d78d330e236bc7910dcffc8f6b8e4ade0d44e9a81dd81167224f1b2dc835ccb WatchSource:0}: Error finding container 8d78d330e236bc7910dcffc8f6b8e4ade0d44e9a81dd81167224f1b2dc835ccb: Status 404 returned error can't find the container with id 8d78d330e236bc7910dcffc8f6b8e4ade0d44e9a81dd81167224f1b2dc835ccb Feb 23 08:41:11 crc kubenswrapper[5118]: I0223 08:41:11.454201 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8be430c-8919-4285-b27c-1df1a6765f35","Type":"ContainerStarted","Data":"8d78d330e236bc7910dcffc8f6b8e4ade0d44e9a81dd81167224f1b2dc835ccb"} Feb 23 08:41:11 crc kubenswrapper[5118]: I0223 08:41:11.454220 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:41:11 crc kubenswrapper[5118]: I0223 08:41:11.460655 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="931dd5ed-6c85-46c7-96c1-85106ac4b986" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" Feb 23 08:41:11 crc kubenswrapper[5118]: I0223 08:41:11.708991 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931dd5ed-6c85-46c7-96c1-85106ac4b986" path="/var/lib/kubelet/pods/931dd5ed-6c85-46c7-96c1-85106ac4b986/volumes" Feb 23 08:41:16 crc kubenswrapper[5118]: I0223 08:41:16.697995 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:41:16 crc kubenswrapper[5118]: E0223 08:41:16.699047 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:41:22 crc kubenswrapper[5118]: I0223 08:41:22.558952 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8be430c-8919-4285-b27c-1df1a6765f35","Type":"ContainerStarted","Data":"0a741d03c4a5a884278bb363145d5230b6ced0174f1221aeef81de5d08cf0361"} Feb 23 08:41:22 crc kubenswrapper[5118]: I0223 08:41:22.589269 5118 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.051859097 podStartE2EDuration="12.589189094s" podCreationTimestamp="2026-02-23 08:41:10 +0000 UTC" firstStartedPulling="2026-02-23 08:41:10.932384601 +0000 UTC m=+6933.936169174" lastFinishedPulling="2026-02-23 08:41:21.469714588 +0000 UTC m=+6944.473499171" observedRunningTime="2026-02-23 08:41:22.586937619 +0000 UTC m=+6945.590722222" watchObservedRunningTime="2026-02-23 08:41:22.589189094 +0000 UTC m=+6945.592973717" Feb 23 08:41:30 crc kubenswrapper[5118]: I0223 08:41:30.698389 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:41:30 crc kubenswrapper[5118]: E0223 08:41:30.699687 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:41:45 crc kubenswrapper[5118]: I0223 08:41:45.698396 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:41:45 crc kubenswrapper[5118]: E0223 08:41:45.699614 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:42:00 crc kubenswrapper[5118]: I0223 08:42:00.697618 5118 scope.go:117] "RemoveContainer" 
containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:42:00 crc kubenswrapper[5118]: E0223 08:42:00.700335 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:42:13 crc kubenswrapper[5118]: I0223 08:42:13.698214 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:42:13 crc kubenswrapper[5118]: E0223 08:42:13.699550 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:42:24 crc kubenswrapper[5118]: I0223 08:42:24.697795 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:42:24 crc kubenswrapper[5118]: E0223 08:42:24.699033 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:42:36 crc kubenswrapper[5118]: I0223 08:42:36.697311 5118 scope.go:117] 
"RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:42:36 crc kubenswrapper[5118]: E0223 08:42:36.698459 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:42:48 crc kubenswrapper[5118]: I0223 08:42:48.698008 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:42:48 crc kubenswrapper[5118]: E0223 08:42:48.699193 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:43:03 crc kubenswrapper[5118]: I0223 08:43:03.698427 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:43:03 crc kubenswrapper[5118]: E0223 08:43:03.699705 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.510004 
5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zc4ct"] Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.512458 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.521127 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8c49-account-create-update-jl45l"] Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.522746 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.525346 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.534173 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zc4ct"] Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.554928 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8c49-account-create-update-jl45l"] Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.618294 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.618502 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts\") pod \"barbican-db-create-zc4ct\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 
08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.618545 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npvd6\" (UniqueName: \"kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.618681 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h8dn\" (UniqueName: \"kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn\") pod \"barbican-db-create-zc4ct\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.721241 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.721463 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npvd6\" (UniqueName: \"kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.721503 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts\") pod \"barbican-db-create-zc4ct\" 
(UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.721588 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h8dn\" (UniqueName: \"kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn\") pod \"barbican-db-create-zc4ct\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.722157 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.722833 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts\") pod \"barbican-db-create-zc4ct\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.745025 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npvd6\" (UniqueName: \"kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6\") pod \"barbican-8c49-account-create-update-jl45l\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.745726 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h8dn\" (UniqueName: \"kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn\") pod \"barbican-db-create-zc4ct\" (UID: 
\"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.840404 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:06 crc kubenswrapper[5118]: I0223 08:43:06.851366 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.356081 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zc4ct"] Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.417464 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8c49-account-create-update-jl45l"] Feb 23 08:43:07 crc kubenswrapper[5118]: W0223 08:43:07.418392 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c3d9bf0_7540_4c43_b198_e096fa76f5ec.slice/crio-2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982 WatchSource:0}: Error finding container 2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982: Status 404 returned error can't find the container with id 2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982 Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.773825 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8c49-account-create-update-jl45l" event={"ID":"9c3d9bf0-7540-4c43-b198-e096fa76f5ec","Type":"ContainerStarted","Data":"7e1702e98e6713f525cd60a9a1e3804e6c34d0d6ebe3057c0f901b0a02810eaf"} Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.773880 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8c49-account-create-update-jl45l" event={"ID":"9c3d9bf0-7540-4c43-b198-e096fa76f5ec","Type":"ContainerStarted","Data":"2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982"} Feb 
23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.775534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zc4ct" event={"ID":"0c0c14ba-8d05-47c2-b951-114afe9834c1","Type":"ContainerStarted","Data":"49d77e9bcd3309a29df1c19fcc9a49b491ea5a748b6ed16d3b07372557a17234"} Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.775557 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zc4ct" event={"ID":"0c0c14ba-8d05-47c2-b951-114afe9834c1","Type":"ContainerStarted","Data":"64ff18d2ff3b799fd37950dc4017eea0df805cbf5fd66bb55c586195a046e718"} Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.800292 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-8c49-account-create-update-jl45l" podStartSLOduration=1.8002551119999999 podStartE2EDuration="1.800255112s" podCreationTimestamp="2026-02-23 08:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:07.791615704 +0000 UTC m=+7050.795400287" watchObservedRunningTime="2026-02-23 08:43:07.800255112 +0000 UTC m=+7050.804039685" Feb 23 08:43:07 crc kubenswrapper[5118]: I0223 08:43:07.815415 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-zc4ct" podStartSLOduration=1.815386736 podStartE2EDuration="1.815386736s" podCreationTimestamp="2026-02-23 08:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:07.811594385 +0000 UTC m=+7050.815378958" watchObservedRunningTime="2026-02-23 08:43:07.815386736 +0000 UTC m=+7050.819171329" Feb 23 08:43:08 crc kubenswrapper[5118]: I0223 08:43:08.784589 5118 generic.go:334] "Generic (PLEG): container finished" podID="0c0c14ba-8d05-47c2-b951-114afe9834c1" 
containerID="49d77e9bcd3309a29df1c19fcc9a49b491ea5a748b6ed16d3b07372557a17234" exitCode=0 Feb 23 08:43:08 crc kubenswrapper[5118]: I0223 08:43:08.784652 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zc4ct" event={"ID":"0c0c14ba-8d05-47c2-b951-114afe9834c1","Type":"ContainerDied","Data":"49d77e9bcd3309a29df1c19fcc9a49b491ea5a748b6ed16d3b07372557a17234"} Feb 23 08:43:08 crc kubenswrapper[5118]: I0223 08:43:08.786752 5118 generic.go:334] "Generic (PLEG): container finished" podID="9c3d9bf0-7540-4c43-b198-e096fa76f5ec" containerID="7e1702e98e6713f525cd60a9a1e3804e6c34d0d6ebe3057c0f901b0a02810eaf" exitCode=0 Feb 23 08:43:08 crc kubenswrapper[5118]: I0223 08:43:08.786780 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8c49-account-create-update-jl45l" event={"ID":"9c3d9bf0-7540-4c43-b198-e096fa76f5ec","Type":"ContainerDied","Data":"7e1702e98e6713f525cd60a9a1e3804e6c34d0d6ebe3057c0f901b0a02810eaf"} Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.201886 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.209159 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.293494 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npvd6\" (UniqueName: \"kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6\") pod \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.293692 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts\") pod \"0c0c14ba-8d05-47c2-b951-114afe9834c1\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.293834 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts\") pod \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\" (UID: \"9c3d9bf0-7540-4c43-b198-e096fa76f5ec\") " Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.293911 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h8dn\" (UniqueName: \"kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn\") pod \"0c0c14ba-8d05-47c2-b951-114afe9834c1\" (UID: \"0c0c14ba-8d05-47c2-b951-114afe9834c1\") " Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.294606 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c3d9bf0-7540-4c43-b198-e096fa76f5ec" (UID: "9c3d9bf0-7540-4c43-b198-e096fa76f5ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.294669 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c0c14ba-8d05-47c2-b951-114afe9834c1" (UID: "0c0c14ba-8d05-47c2-b951-114afe9834c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.300580 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6" (OuterVolumeSpecName: "kube-api-access-npvd6") pod "9c3d9bf0-7540-4c43-b198-e096fa76f5ec" (UID: "9c3d9bf0-7540-4c43-b198-e096fa76f5ec"). InnerVolumeSpecName "kube-api-access-npvd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.307176 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn" (OuterVolumeSpecName: "kube-api-access-4h8dn") pod "0c0c14ba-8d05-47c2-b951-114afe9834c1" (UID: "0c0c14ba-8d05-47c2-b951-114afe9834c1"). InnerVolumeSpecName "kube-api-access-4h8dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.396257 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h8dn\" (UniqueName: \"kubernetes.io/projected/0c0c14ba-8d05-47c2-b951-114afe9834c1-kube-api-access-4h8dn\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.396734 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npvd6\" (UniqueName: \"kubernetes.io/projected/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-kube-api-access-npvd6\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.396754 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c0c14ba-8d05-47c2-b951-114afe9834c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.396772 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c3d9bf0-7540-4c43-b198-e096fa76f5ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.808788 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8c49-account-create-update-jl45l" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.809017 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8c49-account-create-update-jl45l" event={"ID":"9c3d9bf0-7540-4c43-b198-e096fa76f5ec","Type":"ContainerDied","Data":"2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982"} Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.809076 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd02cf93ba735571e4681b0cf4acb4f849bb4b63208b629aa096fefababc982" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.812473 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zc4ct" event={"ID":"0c0c14ba-8d05-47c2-b951-114afe9834c1","Type":"ContainerDied","Data":"64ff18d2ff3b799fd37950dc4017eea0df805cbf5fd66bb55c586195a046e718"} Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.812578 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64ff18d2ff3b799fd37950dc4017eea0df805cbf5fd66bb55c586195a046e718" Feb 23 08:43:10 crc kubenswrapper[5118]: I0223 08:43:10.812689 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zc4ct" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.897881 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9qmfr"] Feb 23 08:43:11 crc kubenswrapper[5118]: E0223 08:43:11.898653 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3d9bf0-7540-4c43-b198-e096fa76f5ec" containerName="mariadb-account-create-update" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.898680 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3d9bf0-7540-4c43-b198-e096fa76f5ec" containerName="mariadb-account-create-update" Feb 23 08:43:11 crc kubenswrapper[5118]: E0223 08:43:11.898713 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0c14ba-8d05-47c2-b951-114afe9834c1" containerName="mariadb-database-create" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.898723 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0c14ba-8d05-47c2-b951-114afe9834c1" containerName="mariadb-database-create" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.898886 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3d9bf0-7540-4c43-b198-e096fa76f5ec" containerName="mariadb-account-create-update" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.898897 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0c14ba-8d05-47c2-b951-114afe9834c1" containerName="mariadb-database-create" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.899452 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.903461 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.903597 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f4v48" Feb 23 08:43:11 crc kubenswrapper[5118]: I0223 08:43:11.922981 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9qmfr"] Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.025758 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.025821 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.025849 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f679d\" (UniqueName: \"kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.127780 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.128040 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.128065 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f679d\" (UniqueName: \"kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.138388 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.158538 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f679d\" (UniqueName: \"kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d\") pod \"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.158967 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle\") pod 
\"barbican-db-sync-9qmfr\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.224342 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.485336 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9qmfr"] Feb 23 08:43:12 crc kubenswrapper[5118]: I0223 08:43:12.836495 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9qmfr" event={"ID":"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf","Type":"ContainerStarted","Data":"6afeb34be16bd37cdf81d91c7f898f21f8047650ae4243f2d2cfdd1759575757"} Feb 23 08:43:17 crc kubenswrapper[5118]: I0223 08:43:17.701780 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:43:17 crc kubenswrapper[5118]: E0223 08:43:17.703720 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:43:17 crc kubenswrapper[5118]: I0223 08:43:17.879591 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9qmfr" event={"ID":"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf","Type":"ContainerStarted","Data":"292a5c10539c7b7306bc92442e41a062f52e91400f8b7275e1dbe8c80121b816"} Feb 23 08:43:17 crc kubenswrapper[5118]: I0223 08:43:17.904491 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9qmfr" podStartSLOduration=2.090029501 podStartE2EDuration="6.904468402s" 
podCreationTimestamp="2026-02-23 08:43:11 +0000 UTC" firstStartedPulling="2026-02-23 08:43:12.487406407 +0000 UTC m=+7055.491190980" lastFinishedPulling="2026-02-23 08:43:17.301845298 +0000 UTC m=+7060.305629881" observedRunningTime="2026-02-23 08:43:17.894716907 +0000 UTC m=+7060.898501550" watchObservedRunningTime="2026-02-23 08:43:17.904468402 +0000 UTC m=+7060.908252975" Feb 23 08:43:19 crc kubenswrapper[5118]: I0223 08:43:19.903874 5118 generic.go:334] "Generic (PLEG): container finished" podID="ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" containerID="292a5c10539c7b7306bc92442e41a062f52e91400f8b7275e1dbe8c80121b816" exitCode=0 Feb 23 08:43:19 crc kubenswrapper[5118]: I0223 08:43:19.904032 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9qmfr" event={"ID":"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf","Type":"ContainerDied","Data":"292a5c10539c7b7306bc92442e41a062f52e91400f8b7275e1dbe8c80121b816"} Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.317270 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.399247 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f679d\" (UniqueName: \"kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d\") pod \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.399459 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data\") pod \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.399488 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle\") pod \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\" (UID: \"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf\") " Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.407830 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" (UID: "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.408886 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d" (OuterVolumeSpecName: "kube-api-access-f679d") pod "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" (UID: "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf"). 
InnerVolumeSpecName "kube-api-access-f679d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.433779 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" (UID: "ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.502617 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.502663 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.502682 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f679d\" (UniqueName: \"kubernetes.io/projected/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf-kube-api-access-f679d\") on node \"crc\" DevicePath \"\"" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.932325 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9qmfr" event={"ID":"ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf","Type":"ContainerDied","Data":"6afeb34be16bd37cdf81d91c7f898f21f8047650ae4243f2d2cfdd1759575757"} Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.932609 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6afeb34be16bd37cdf81d91c7f898f21f8047650ae4243f2d2cfdd1759575757" Feb 23 08:43:21 crc kubenswrapper[5118]: I0223 08:43:21.932460 5118 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9qmfr" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.213221 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c8f867454-q8dx9"] Feb 23 08:43:22 crc kubenswrapper[5118]: E0223 08:43:22.213575 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" containerName="barbican-db-sync" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.213589 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" containerName="barbican-db-sync" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.213734 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" containerName="barbican-db-sync" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.214665 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.225984 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.227847 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.227932 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f4v48" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.246044 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69b5cb8cbf-z5x2s"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.251304 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.261409 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.264242 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c8f867454-q8dx9"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.286652 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b5cb8cbf-z5x2s"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334605 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b45cc0-f140-4ef3-ad47-56be870583a5-logs\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334683 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-combined-ca-bundle\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334724 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data-custom\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334749 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbp24\" (UniqueName: \"kubernetes.io/projected/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-kube-api-access-wbp24\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334768 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmsz\" (UniqueName: \"kubernetes.io/projected/33b45cc0-f140-4ef3-ad47-56be870583a5-kube-api-access-jjmsz\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334811 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334845 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data-custom\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334870 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-combined-ca-bundle\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: 
\"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334887 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-logs\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.334919 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.406693 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.408474 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.452562 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data-custom\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.452697 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-combined-ca-bundle\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.452760 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-logs\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.452870 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.452974 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: 
\"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.453022 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.453133 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdz4s\" (UniqueName: \"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.453187 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b45cc0-f140-4ef3-ad47-56be870583a5-logs\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.453277 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.453317 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-combined-ca-bundle\") pod 
\"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.464485 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.464568 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data-custom\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.464619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbp24\" (UniqueName: \"kubernetes.io/projected/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-kube-api-access-wbp24\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.464668 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmsz\" (UniqueName: \"kubernetes.io/projected/33b45cc0-f140-4ef3-ad47-56be870583a5-kube-api-access-jjmsz\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.464813 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.481327 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33b45cc0-f140-4ef3-ad47-56be870583a5-logs\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.492505 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-logs\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.511468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data-custom\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.514082 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data-custom\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.533864 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-combined-ca-bundle\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.534852 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.536794 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-combined-ca-bundle\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.539741 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-config-data\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.544853 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbp24\" (UniqueName: \"kubernetes.io/projected/afa3b53c-97b6-4eb6-b1f3-22b5feef02b5-kube-api-access-wbp24\") pod \"barbican-worker-69b5cb8cbf-z5x2s\" (UID: \"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5\") " pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.546061 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33b45cc0-f140-4ef3-ad47-56be870583a5-config-data\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 
crc kubenswrapper[5118]: I0223 08:43:22.594087 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmsz\" (UniqueName: \"kubernetes.io/projected/33b45cc0-f140-4ef3-ad47-56be870583a5-kube-api-access-jjmsz\") pod \"barbican-keystone-listener-c8f867454-q8dx9\" (UID: \"33b45cc0-f140-4ef3-ad47-56be870583a5\") " pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.608716 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.611388 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.611414 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.611446 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdz4s\" (UniqueName: \"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.611480 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb\") 
pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.611511 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.614444 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.615601 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.631836 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.632320 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " 
pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.643163 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-678dd5c864-6nzbp"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.644625 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.648387 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdz4s\" (UniqueName: \"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s\") pod \"dnsmasq-dns-ff98d5d9f-v4dg9\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.648763 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.678713 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-678dd5c864-6nzbp"] Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.714024 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2rg\" (UniqueName: \"kubernetes.io/projected/a5bb46fd-6ecf-49f2-8b75-082309e154ab-kube-api-access-8s2rg\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.714074 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-combined-ca-bundle\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: 
I0223 08:43:22.714748 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data-custom\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.714795 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bb46fd-6ecf-49f2-8b75-082309e154ab-logs\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.714852 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.793880 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.816956 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data-custom\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.817027 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bb46fd-6ecf-49f2-8b75-082309e154ab-logs\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.817574 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bb46fd-6ecf-49f2-8b75-082309e154ab-logs\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.817705 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.817765 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2rg\" (UniqueName: \"kubernetes.io/projected/a5bb46fd-6ecf-49f2-8b75-082309e154ab-kube-api-access-8s2rg\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 
08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.818785 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-combined-ca-bundle\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.823102 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-combined-ca-bundle\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.823788 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.824053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5bb46fd-6ecf-49f2-8b75-082309e154ab-config-data-custom\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.846397 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" Feb 23 08:43:22 crc kubenswrapper[5118]: I0223 08:43:22.860334 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s2rg\" (UniqueName: \"kubernetes.io/projected/a5bb46fd-6ecf-49f2-8b75-082309e154ab-kube-api-access-8s2rg\") pod \"barbican-api-678dd5c864-6nzbp\" (UID: \"a5bb46fd-6ecf-49f2-8b75-082309e154ab\") " pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.042121 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.172260 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b5cb8cbf-z5x2s"] Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.360725 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.547678 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c8f867454-q8dx9"] Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.620518 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-678dd5c864-6nzbp"] Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.978781 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678dd5c864-6nzbp" event={"ID":"a5bb46fd-6ecf-49f2-8b75-082309e154ab","Type":"ContainerStarted","Data":"c4310a0e37dc6cebb74f4a6ce59458fe651bd03e8fa53ab1ed3aa86f7cb24671"} Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.979274 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678dd5c864-6nzbp" event={"ID":"a5bb46fd-6ecf-49f2-8b75-082309e154ab","Type":"ContainerStarted","Data":"8d5b4f8f4bac49323dc82f46a29e82514f044a74e4bf28a9a059c9837076e7c1"} Feb 23 08:43:23 crc 
kubenswrapper[5118]: I0223 08:43:23.981893 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" event={"ID":"33b45cc0-f140-4ef3-ad47-56be870583a5","Type":"ContainerStarted","Data":"1be4d8d16c5136b5135e72321aaa2cdea1f648886b763189f4e4c593282ee6f2"} Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.985983 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" event={"ID":"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5","Type":"ContainerStarted","Data":"72a4b65b5b61e25cf3b1ab4fa5ecf400e611df4a1e847be271fe58a59defe35b"} Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.989709 5118 generic.go:334] "Generic (PLEG): container finished" podID="6113f337-ef10-43ad-9b81-84371a082261" containerID="bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb" exitCode=0 Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.989776 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" event={"ID":"6113f337-ef10-43ad-9b81-84371a082261","Type":"ContainerDied","Data":"bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb"} Feb 23 08:43:23 crc kubenswrapper[5118]: I0223 08:43:23.989804 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" event={"ID":"6113f337-ef10-43ad-9b81-84371a082261","Type":"ContainerStarted","Data":"3332fb5d5e5d45001c87c1bb5669d99486b692413cab29da8ad402fca7ad995a"} Feb 23 08:43:25 crc kubenswrapper[5118]: I0223 08:43:25.010938 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-678dd5c864-6nzbp" event={"ID":"a5bb46fd-6ecf-49f2-8b75-082309e154ab","Type":"ContainerStarted","Data":"320038e5fdb7de89b2376d91c67d6118e647482e2c0f0fac83564ca443faa09b"} Feb 23 08:43:25 crc kubenswrapper[5118]: I0223 08:43:25.011666 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:25 crc kubenswrapper[5118]: I0223 08:43:25.011715 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-678dd5c864-6nzbp" Feb 23 08:43:25 crc kubenswrapper[5118]: I0223 08:43:25.047702 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-678dd5c864-6nzbp" podStartSLOduration=3.04767065 podStartE2EDuration="3.04767065s" podCreationTimestamp="2026-02-23 08:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:25.041825048 +0000 UTC m=+7068.045609611" watchObservedRunningTime="2026-02-23 08:43:25.04767065 +0000 UTC m=+7068.051455243" Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.033407 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" event={"ID":"33b45cc0-f140-4ef3-ad47-56be870583a5","Type":"ContainerStarted","Data":"fb48c73aa94ded27397df835b6dff17b95abb2d1c872260ddde263155a2318ae"} Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.033504 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" event={"ID":"33b45cc0-f140-4ef3-ad47-56be870583a5","Type":"ContainerStarted","Data":"e90b183c0461754eafd56b2e49b705455258437ffe4a0db20d62e43993013b4e"} Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.037306 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" event={"ID":"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5","Type":"ContainerStarted","Data":"455495075d44fde7af61be121f05a832c489da97d73ed93efbbf1df3cc035b06"} Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.037418 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" 
event={"ID":"afa3b53c-97b6-4eb6-b1f3-22b5feef02b5","Type":"ContainerStarted","Data":"72339f1d0f93390b64d0245fee39520f3e33a8b71f923e9c24557a1900ac82ee"}
Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.042303 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" event={"ID":"6113f337-ef10-43ad-9b81-84371a082261","Type":"ContainerStarted","Data":"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d"}
Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.043711 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9"
Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.123619 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c8f867454-q8dx9" podStartSLOduration=2.816370448 podStartE2EDuration="4.123596742s" podCreationTimestamp="2026-02-23 08:43:22 +0000 UTC" firstStartedPulling="2026-02-23 08:43:23.54505793 +0000 UTC m=+7066.548842503" lastFinishedPulling="2026-02-23 08:43:24.852284224 +0000 UTC m=+7067.856068797" observedRunningTime="2026-02-23 08:43:26.120562619 +0000 UTC m=+7069.124347192" watchObservedRunningTime="2026-02-23 08:43:26.123596742 +0000 UTC m=+7069.127381315"
Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.183947 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" podStartSLOduration=4.183913874 podStartE2EDuration="4.183913874s" podCreationTimestamp="2026-02-23 08:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:26.17624444 +0000 UTC m=+7069.180029013" watchObservedRunningTime="2026-02-23 08:43:26.183913874 +0000 UTC m=+7069.187698467"
Feb 23 08:43:26 crc kubenswrapper[5118]: I0223 08:43:26.221969 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69b5cb8cbf-z5x2s" podStartSLOduration=2.59225352 podStartE2EDuration="4.22194497s" podCreationTimestamp="2026-02-23 08:43:22 +0000 UTC" firstStartedPulling="2026-02-23 08:43:23.221658741 +0000 UTC m=+7066.225443314" lastFinishedPulling="2026-02-23 08:43:24.851350191 +0000 UTC m=+7067.855134764" observedRunningTime="2026-02-23 08:43:26.214406079 +0000 UTC m=+7069.218190652" watchObservedRunningTime="2026-02-23 08:43:26.22194497 +0000 UTC m=+7069.225729543"
Feb 23 08:43:29 crc kubenswrapper[5118]: I0223 08:43:29.714269 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-678dd5c864-6nzbp"
Feb 23 08:43:31 crc kubenswrapper[5118]: I0223 08:43:31.306057 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-678dd5c864-6nzbp"
Feb 23 08:43:32 crc kubenswrapper[5118]: I0223 08:43:32.697362 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92"
Feb 23 08:43:32 crc kubenswrapper[5118]: E0223 08:43:32.697929 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:43:32 crc kubenswrapper[5118]: I0223 08:43:32.798539 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9"
Feb 23 08:43:32 crc kubenswrapper[5118]: I0223 08:43:32.888431 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"]
Feb 23 08:43:32 crc kubenswrapper[5118]: I0223 08:43:32.888843 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd79565d5-np795" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="dnsmasq-dns" containerID="cri-o://cc78c6498a22f53dcfd7275c02f021f05352f39cbc38b61b42d39fb01a110147" gracePeriod=10
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.129638 5118 generic.go:334] "Generic (PLEG): container finished" podID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerID="cc78c6498a22f53dcfd7275c02f021f05352f39cbc38b61b42d39fb01a110147" exitCode=0
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.129734 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd79565d5-np795" event={"ID":"c8fc4131-d6ba-4758-876d-9b0c1fe23c83","Type":"ContainerDied","Data":"cc78c6498a22f53dcfd7275c02f021f05352f39cbc38b61b42d39fb01a110147"}
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.460141 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd79565d5-np795"
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.512360 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config\") pod \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") "
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.512526 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb\") pod \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") "
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.512550 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb\") pod \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") "
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.512572 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc\") pod \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") "
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.512612 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgv4\" (UniqueName: \"kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4\") pod \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\" (UID: \"c8fc4131-d6ba-4758-876d-9b0c1fe23c83\") "
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.546645 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4" (OuterVolumeSpecName: "kube-api-access-2rgv4") pod "c8fc4131-d6ba-4758-876d-9b0c1fe23c83" (UID: "c8fc4131-d6ba-4758-876d-9b0c1fe23c83"). InnerVolumeSpecName "kube-api-access-2rgv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.582038 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8fc4131-d6ba-4758-876d-9b0c1fe23c83" (UID: "c8fc4131-d6ba-4758-876d-9b0c1fe23c83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.593397 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8fc4131-d6ba-4758-876d-9b0c1fe23c83" (UID: "c8fc4131-d6ba-4758-876d-9b0c1fe23c83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.606306 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config" (OuterVolumeSpecName: "config") pod "c8fc4131-d6ba-4758-876d-9b0c1fe23c83" (UID: "c8fc4131-d6ba-4758-876d-9b0c1fe23c83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.608420 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8fc4131-d6ba-4758-876d-9b0c1fe23c83" (UID: "c8fc4131-d6ba-4758-876d-9b0c1fe23c83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.614234 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.614267 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgv4\" (UniqueName: \"kubernetes.io/projected/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-kube-api-access-2rgv4\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.614279 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.614289 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:33 crc kubenswrapper[5118]: I0223 08:43:33.614298 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8fc4131-d6ba-4758-876d-9b0c1fe23c83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.142005 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd79565d5-np795" event={"ID":"c8fc4131-d6ba-4758-876d-9b0c1fe23c83","Type":"ContainerDied","Data":"977427d847862612e5f87fe8d0e9c7dafb3a61759cae63bf48dd859d877f0b9c"}
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.142123 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd79565d5-np795"
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.142427 5118 scope.go:117] "RemoveContainer" containerID="cc78c6498a22f53dcfd7275c02f021f05352f39cbc38b61b42d39fb01a110147"
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.174503 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"]
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.183600 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd79565d5-np795"]
Feb 23 08:43:34 crc kubenswrapper[5118]: I0223 08:43:34.183706 5118 scope.go:117] "RemoveContainer" containerID="84e51fed3b79b147343402e0dedae7126a9ab2e625829130d70116819d24569b"
Feb 23 08:43:35 crc kubenswrapper[5118]: I0223 08:43:35.716967 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" path="/var/lib/kubelet/pods/c8fc4131-d6ba-4758-876d-9b0c1fe23c83/volumes"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.746996 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9cp6b"]
Feb 23 08:43:42 crc kubenswrapper[5118]: E0223 08:43:42.749239 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="init"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.749358 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="init"
Feb 23 08:43:42 crc kubenswrapper[5118]: E0223 08:43:42.749469 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="dnsmasq-dns"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.749553 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="dnsmasq-dns"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.749827 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8fc4131-d6ba-4758-876d-9b0c1fe23c83" containerName="dnsmasq-dns"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.750766 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.761614 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9cp6b"]
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.825801 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.825952 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghsml\" (UniqueName: \"kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.859351 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-599a-account-create-update-cbg84"]
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.860546 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.864906 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.881057 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-599a-account-create-update-cbg84"]
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.927090 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghsml\" (UniqueName: \"kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.927204 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwc4q\" (UniqueName: \"kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.927266 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.927522 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.927980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:42 crc kubenswrapper[5118]: I0223 08:43:42.945179 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghsml\" (UniqueName: \"kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml\") pod \"neutron-db-create-9cp6b\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") " pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.029553 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwc4q\" (UniqueName: \"kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.029726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.031073 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.047595 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwc4q\" (UniqueName: \"kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q\") pod \"neutron-599a-account-create-update-cbg84\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") " pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.074245 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.178273 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.690044 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9cp6b"]
Feb 23 08:43:43 crc kubenswrapper[5118]: I0223 08:43:43.813514 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-599a-account-create-update-cbg84"]
Feb 23 08:43:43 crc kubenswrapper[5118]: W0223 08:43:43.817814 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda4a68c4_1cea_47aa_8c95_14ac66230b69.slice/crio-f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8 WatchSource:0}: Error finding container f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8: Status 404 returned error can't find the container with id f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.300446 5118 generic.go:334] "Generic (PLEG): container finished" podID="cf806736-41ea-43fa-8bef-3a6e1099f205" containerID="52295287fb99064626e4f85b029292f39e059b8b3af1b87256e2917cb5bcbca8" exitCode=0
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.300513 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9cp6b" event={"ID":"cf806736-41ea-43fa-8bef-3a6e1099f205","Type":"ContainerDied","Data":"52295287fb99064626e4f85b029292f39e059b8b3af1b87256e2917cb5bcbca8"}
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.300585 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9cp6b" event={"ID":"cf806736-41ea-43fa-8bef-3a6e1099f205","Type":"ContainerStarted","Data":"6fbfcbf059352deb6918b2d7c3309c89f42baf36c026e9e30a7f0eac9dd5a9d9"}
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.303749 5118 generic.go:334] "Generic (PLEG): container finished" podID="da4a68c4-1cea-47aa-8c95-14ac66230b69" containerID="ee73b33aec80b30dc9c913fd721b96c9b2a925fd87b7f41085d40b5c946603c7" exitCode=0
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.303816 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-599a-account-create-update-cbg84" event={"ID":"da4a68c4-1cea-47aa-8c95-14ac66230b69","Type":"ContainerDied","Data":"ee73b33aec80b30dc9c913fd721b96c9b2a925fd87b7f41085d40b5c946603c7"}
Feb 23 08:43:44 crc kubenswrapper[5118]: I0223 08:43:44.304190 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-599a-account-create-update-cbg84" event={"ID":"da4a68c4-1cea-47aa-8c95-14ac66230b69","Type":"ContainerStarted","Data":"f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8"}
Feb 23 08:43:45 crc kubenswrapper[5118]: I0223 08:43:45.697942 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92"
Feb 23 08:43:45 crc kubenswrapper[5118]: I0223 08:43:45.878113 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:45 crc kubenswrapper[5118]: I0223 08:43:45.883162 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.008423 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwc4q\" (UniqueName: \"kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q\") pod \"da4a68c4-1cea-47aa-8c95-14ac66230b69\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") "
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.008685 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghsml\" (UniqueName: \"kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml\") pod \"cf806736-41ea-43fa-8bef-3a6e1099f205\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") "
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.008845 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts\") pod \"cf806736-41ea-43fa-8bef-3a6e1099f205\" (UID: \"cf806736-41ea-43fa-8bef-3a6e1099f205\") "
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.008915 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts\") pod \"da4a68c4-1cea-47aa-8c95-14ac66230b69\" (UID: \"da4a68c4-1cea-47aa-8c95-14ac66230b69\") "
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.011266 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf806736-41ea-43fa-8bef-3a6e1099f205" (UID: "cf806736-41ea-43fa-8bef-3a6e1099f205"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.012412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da4a68c4-1cea-47aa-8c95-14ac66230b69" (UID: "da4a68c4-1cea-47aa-8c95-14ac66230b69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.018867 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml" (OuterVolumeSpecName: "kube-api-access-ghsml") pod "cf806736-41ea-43fa-8bef-3a6e1099f205" (UID: "cf806736-41ea-43fa-8bef-3a6e1099f205"). InnerVolumeSpecName "kube-api-access-ghsml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.019301 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q" (OuterVolumeSpecName: "kube-api-access-dwc4q") pod "da4a68c4-1cea-47aa-8c95-14ac66230b69" (UID: "da4a68c4-1cea-47aa-8c95-14ac66230b69"). InnerVolumeSpecName "kube-api-access-dwc4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.111057 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwc4q\" (UniqueName: \"kubernetes.io/projected/da4a68c4-1cea-47aa-8c95-14ac66230b69-kube-api-access-dwc4q\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.111138 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghsml\" (UniqueName: \"kubernetes.io/projected/cf806736-41ea-43fa-8bef-3a6e1099f205-kube-api-access-ghsml\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.111154 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf806736-41ea-43fa-8bef-3a6e1099f205-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.111165 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da4a68c4-1cea-47aa-8c95-14ac66230b69-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.328465 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9cp6b"
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.328475 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9cp6b" event={"ID":"cf806736-41ea-43fa-8bef-3a6e1099f205","Type":"ContainerDied","Data":"6fbfcbf059352deb6918b2d7c3309c89f42baf36c026e9e30a7f0eac9dd5a9d9"}
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.328936 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbfcbf059352deb6918b2d7c3309c89f42baf36c026e9e30a7f0eac9dd5a9d9"
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.331335 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb"}
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.334349 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-599a-account-create-update-cbg84" event={"ID":"da4a68c4-1cea-47aa-8c95-14ac66230b69","Type":"ContainerDied","Data":"f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8"}
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.334399 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f867ed9847f7a6d941d9893ce45ef77be34e09e150b79c45fb9c2fba3333a4a8"
Feb 23 08:43:46 crc kubenswrapper[5118]: I0223 08:43:46.334416 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-599a-account-create-update-cbg84"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.196882 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-svcx6"]
Feb 23 08:43:48 crc kubenswrapper[5118]: E0223 08:43:48.198005 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4a68c4-1cea-47aa-8c95-14ac66230b69" containerName="mariadb-account-create-update"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.198027 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4a68c4-1cea-47aa-8c95-14ac66230b69" containerName="mariadb-account-create-update"
Feb 23 08:43:48 crc kubenswrapper[5118]: E0223 08:43:48.198062 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf806736-41ea-43fa-8bef-3a6e1099f205" containerName="mariadb-database-create"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.198074 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf806736-41ea-43fa-8bef-3a6e1099f205" containerName="mariadb-database-create"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.198341 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf806736-41ea-43fa-8bef-3a6e1099f205" containerName="mariadb-database-create"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.198362 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4a68c4-1cea-47aa-8c95-14ac66230b69" containerName="mariadb-account-create-update"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.199248 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.202174 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.202485 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pxkbr"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.202657 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.208605 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-svcx6"]
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.254286 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqkg\" (UniqueName: \"kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.254389 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.254427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.355583 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.356034 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.356212 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqkg\" (UniqueName: \"kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.362706 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.367951 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.376269 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqkg\" (UniqueName: \"kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg\") pod \"neutron-db-sync-svcx6\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") " pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.524595 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:48 crc kubenswrapper[5118]: I0223 08:43:48.795362 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-svcx6"]
Feb 23 08:43:49 crc kubenswrapper[5118]: I0223 08:43:49.365009 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-svcx6" event={"ID":"b363e1ea-c245-4751-a7d7-595f7cb83b25","Type":"ContainerStarted","Data":"02a9aea63e1501316ecf7d3daf0e6b20b4b0c3c97060236f6d4545d4837ced3b"}
Feb 23 08:43:49 crc kubenswrapper[5118]: I0223 08:43:49.365424 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-svcx6" event={"ID":"b363e1ea-c245-4751-a7d7-595f7cb83b25","Type":"ContainerStarted","Data":"1e8e0fcfebe1288216b6abfbb69be30210aecb0a7a4a4b97e8d55676754e1987"}
Feb 23 08:43:49 crc kubenswrapper[5118]: I0223 08:43:49.407631 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-svcx6" podStartSLOduration=1.407604905 podStartE2EDuration="1.407604905s" podCreationTimestamp="2026-02-23 08:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:49.396413176 +0000 UTC m=+7092.400197759" watchObservedRunningTime="2026-02-23 08:43:49.407604905 +0000 UTC m=+7092.411389488"
Feb 23 08:43:53 crc kubenswrapper[5118]: I0223 08:43:53.404736 5118 generic.go:334] "Generic (PLEG): container finished" podID="b363e1ea-c245-4751-a7d7-595f7cb83b25" containerID="02a9aea63e1501316ecf7d3daf0e6b20b4b0c3c97060236f6d4545d4837ced3b" exitCode=0
Feb 23 08:43:53 crc kubenswrapper[5118]: I0223 08:43:53.404857 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-svcx6" event={"ID":"b363e1ea-c245-4751-a7d7-595f7cb83b25","Type":"ContainerDied","Data":"02a9aea63e1501316ecf7d3daf0e6b20b4b0c3c97060236f6d4545d4837ced3b"}
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.829641 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.909438 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjqkg\" (UniqueName: \"kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg\") pod \"b363e1ea-c245-4751-a7d7-595f7cb83b25\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") "
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.909978 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config\") pod \"b363e1ea-c245-4751-a7d7-595f7cb83b25\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") "
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.910339 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle\") pod \"b363e1ea-c245-4751-a7d7-595f7cb83b25\" (UID: \"b363e1ea-c245-4751-a7d7-595f7cb83b25\") "
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.916818 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg" (OuterVolumeSpecName: "kube-api-access-fjqkg") pod "b363e1ea-c245-4751-a7d7-595f7cb83b25" (UID: "b363e1ea-c245-4751-a7d7-595f7cb83b25"). InnerVolumeSpecName "kube-api-access-fjqkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.940825 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config" (OuterVolumeSpecName: "config") pod "b363e1ea-c245-4751-a7d7-595f7cb83b25" (UID: "b363e1ea-c245-4751-a7d7-595f7cb83b25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:43:54 crc kubenswrapper[5118]: I0223 08:43:54.951929 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b363e1ea-c245-4751-a7d7-595f7cb83b25" (UID: "b363e1ea-c245-4751-a7d7-595f7cb83b25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.014003 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjqkg\" (UniqueName: \"kubernetes.io/projected/b363e1ea-c245-4751-a7d7-595f7cb83b25-kube-api-access-fjqkg\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.014055 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.014069 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b363e1ea-c245-4751-a7d7-595f7cb83b25-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.470642 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-svcx6" event={"ID":"b363e1ea-c245-4751-a7d7-595f7cb83b25","Type":"ContainerDied","Data":"1e8e0fcfebe1288216b6abfbb69be30210aecb0a7a4a4b97e8d55676754e1987"}
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.471149 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8e0fcfebe1288216b6abfbb69be30210aecb0a7a4a4b97e8d55676754e1987"
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.470694 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-svcx6"
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.618913 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"]
Feb 23 08:43:55 crc kubenswrapper[5118]: E0223 08:43:55.619341 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b363e1ea-c245-4751-a7d7-595f7cb83b25" containerName="neutron-db-sync"
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.619359 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b363e1ea-c245-4751-a7d7-595f7cb83b25" containerName="neutron-db-sync"
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.619526 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b363e1ea-c245-4751-a7d7-595f7cb83b25" containerName="neutron-db-sync"
Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.620447 5118 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.657962 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"] Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.727022 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.727118 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7jhn\" (UniqueName: \"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.727145 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.727460 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.727562 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.791515 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-674bbb4f9c-tfqpc"] Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.796172 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.799040 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.799468 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pxkbr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.802709 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.812894 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-674bbb4f9c-tfqpc"] Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.833238 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.834606 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: 
\"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.834848 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.834977 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7jhn\" (UniqueName: \"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.835088 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.835406 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.835536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 
08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.837014 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.834479 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.862165 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7jhn\" (UniqueName: \"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn\") pod \"dnsmasq-dns-546fd8f979-dwngr\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.937276 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.937424 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-httpd-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.937453 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-combined-ca-bundle\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.937544 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbb2z\" (UniqueName: \"kubernetes.io/projected/5af9d028-a91a-4540-926f-d0c7c50ca61b-kube-api-access-tbb2z\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:55 crc kubenswrapper[5118]: I0223 08:43:55.949021 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.039318 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-httpd-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.039398 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-combined-ca-bundle\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.039473 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbb2z\" (UniqueName: \"kubernetes.io/projected/5af9d028-a91a-4540-926f-d0c7c50ca61b-kube-api-access-tbb2z\") pod \"neutron-674bbb4f9c-tfqpc\" 
(UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.039619 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.046734 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-httpd-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.046890 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-config\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.056038 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af9d028-a91a-4540-926f-d0c7c50ca61b-combined-ca-bundle\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.069312 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbb2z\" (UniqueName: \"kubernetes.io/projected/5af9d028-a91a-4540-926f-d0c7c50ca61b-kube-api-access-tbb2z\") pod \"neutron-674bbb4f9c-tfqpc\" (UID: \"5af9d028-a91a-4540-926f-d0c7c50ca61b\") " pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 
08:43:56.121807 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.545875 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"] Feb 23 08:43:56 crc kubenswrapper[5118]: W0223 08:43:56.548217 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod377d595e_f15a_4914_8860_b3f447cf04fc.slice/crio-1c8b17204cf89652438fc4813a53992e500d7d3fd141659d7366eb77a5003925 WatchSource:0}: Error finding container 1c8b17204cf89652438fc4813a53992e500d7d3fd141659d7366eb77a5003925: Status 404 returned error can't find the container with id 1c8b17204cf89652438fc4813a53992e500d7d3fd141659d7366eb77a5003925 Feb 23 08:43:56 crc kubenswrapper[5118]: I0223 08:43:56.818872 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-674bbb4f9c-tfqpc"] Feb 23 08:43:56 crc kubenswrapper[5118]: W0223 08:43:56.856480 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5af9d028_a91a_4540_926f_d0c7c50ca61b.slice/crio-47a25a4a2e87a6bfb9ae86d9e1de137a06b36d672c5d273c0e0051b7ff641cac WatchSource:0}: Error finding container 47a25a4a2e87a6bfb9ae86d9e1de137a06b36d672c5d273c0e0051b7ff641cac: Status 404 returned error can't find the container with id 47a25a4a2e87a6bfb9ae86d9e1de137a06b36d672c5d273c0e0051b7ff641cac Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.527360 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-674bbb4f9c-tfqpc" event={"ID":"5af9d028-a91a-4540-926f-d0c7c50ca61b","Type":"ContainerStarted","Data":"8aef6176b0b32ead10cd420fbc095a549267c1820bf75ebb646a168bc09ec72b"} Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.528034 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-674bbb4f9c-tfqpc" event={"ID":"5af9d028-a91a-4540-926f-d0c7c50ca61b","Type":"ContainerStarted","Data":"41d86ee6a5aa2be367f6dba50461e63a883652dd7a9b5521becfb2857561a4b3"} Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.528154 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-674bbb4f9c-tfqpc" event={"ID":"5af9d028-a91a-4540-926f-d0c7c50ca61b","Type":"ContainerStarted","Data":"47a25a4a2e87a6bfb9ae86d9e1de137a06b36d672c5d273c0e0051b7ff641cac"} Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.528271 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.543467 5118 generic.go:334] "Generic (PLEG): container finished" podID="377d595e-f15a-4914-8860-b3f447cf04fc" containerID="f26674dc4bed47340c4cc48ec1b53d50988d15b7575e1bb64f16ebedd754588e" exitCode=0 Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.543527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" event={"ID":"377d595e-f15a-4914-8860-b3f447cf04fc","Type":"ContainerDied","Data":"f26674dc4bed47340c4cc48ec1b53d50988d15b7575e1bb64f16ebedd754588e"} Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.543564 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" event={"ID":"377d595e-f15a-4914-8860-b3f447cf04fc","Type":"ContainerStarted","Data":"1c8b17204cf89652438fc4813a53992e500d7d3fd141659d7366eb77a5003925"} Feb 23 08:43:57 crc kubenswrapper[5118]: I0223 08:43:57.569691 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-674bbb4f9c-tfqpc" podStartSLOduration=2.56966339 podStartE2EDuration="2.56966339s" podCreationTimestamp="2026-02-23 08:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 
08:43:57.562329964 +0000 UTC m=+7100.566114537" watchObservedRunningTime="2026-02-23 08:43:57.56966339 +0000 UTC m=+7100.573447963" Feb 23 08:43:58 crc kubenswrapper[5118]: I0223 08:43:58.553976 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" event={"ID":"377d595e-f15a-4914-8860-b3f447cf04fc","Type":"ContainerStarted","Data":"e11321901c3d910ba16b14f88ec264ba526b1ee0c8e6a29b59254945447c8a77"} Feb 23 08:43:58 crc kubenswrapper[5118]: I0223 08:43:58.554474 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:44:05 crc kubenswrapper[5118]: I0223 08:44:05.951383 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:44:05 crc kubenswrapper[5118]: I0223 08:44:05.986600 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" podStartSLOduration=10.986565554 podStartE2EDuration="10.986565554s" podCreationTimestamp="2026-02-23 08:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:43:58.585560777 +0000 UTC m=+7101.589345350" watchObservedRunningTime="2026-02-23 08:44:05.986565554 +0000 UTC m=+7108.990350167" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.042977 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.043347 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="dnsmasq-dns" containerID="cri-o://cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d" gracePeriod=10 Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.512013 5118 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.577874 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdz4s\" (UniqueName: \"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s\") pod \"6113f337-ef10-43ad-9b81-84371a082261\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.578014 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb\") pod \"6113f337-ef10-43ad-9b81-84371a082261\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.578118 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc\") pod \"6113f337-ef10-43ad-9b81-84371a082261\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.578220 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb\") pod \"6113f337-ef10-43ad-9b81-84371a082261\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.578249 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config\") pod \"6113f337-ef10-43ad-9b81-84371a082261\" (UID: \"6113f337-ef10-43ad-9b81-84371a082261\") " Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.597696 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s" (OuterVolumeSpecName: "kube-api-access-wdz4s") pod "6113f337-ef10-43ad-9b81-84371a082261" (UID: "6113f337-ef10-43ad-9b81-84371a082261"). InnerVolumeSpecName "kube-api-access-wdz4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.624746 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6113f337-ef10-43ad-9b81-84371a082261" (UID: "6113f337-ef10-43ad-9b81-84371a082261"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.624740 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config" (OuterVolumeSpecName: "config") pod "6113f337-ef10-43ad-9b81-84371a082261" (UID: "6113f337-ef10-43ad-9b81-84371a082261"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.639331 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6113f337-ef10-43ad-9b81-84371a082261" (UID: "6113f337-ef10-43ad-9b81-84371a082261"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.642034 5118 generic.go:334] "Generic (PLEG): container finished" podID="6113f337-ef10-43ad-9b81-84371a082261" containerID="cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d" exitCode=0 Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.642081 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" event={"ID":"6113f337-ef10-43ad-9b81-84371a082261","Type":"ContainerDied","Data":"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d"} Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.642128 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" event={"ID":"6113f337-ef10-43ad-9b81-84371a082261","Type":"ContainerDied","Data":"3332fb5d5e5d45001c87c1bb5669d99486b692413cab29da8ad402fca7ad995a"} Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.642147 5118 scope.go:117] "RemoveContainer" containerID="cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.642155 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff98d5d9f-v4dg9" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.650982 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6113f337-ef10-43ad-9b81-84371a082261" (UID: "6113f337-ef10-43ad-9b81-84371a082261"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.679894 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.679922 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.679933 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.679944 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdz4s\" (UniqueName: \"kubernetes.io/projected/6113f337-ef10-43ad-9b81-84371a082261-kube-api-access-wdz4s\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.679954 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6113f337-ef10-43ad-9b81-84371a082261-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.682110 5118 scope.go:117] "RemoveContainer" containerID="bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.709915 5118 scope.go:117] "RemoveContainer" containerID="cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d" Feb 23 08:44:06 crc kubenswrapper[5118]: E0223 08:44:06.710400 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d\": 
container with ID starting with cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d not found: ID does not exist" containerID="cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.710437 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d"} err="failed to get container status \"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d\": rpc error: code = NotFound desc = could not find container \"cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d\": container with ID starting with cb45a60681b01b20299c601c156988e50c8e050abd45a284472ab5cfb41ff71d not found: ID does not exist" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.710461 5118 scope.go:117] "RemoveContainer" containerID="bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb" Feb 23 08:44:06 crc kubenswrapper[5118]: E0223 08:44:06.710849 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb\": container with ID starting with bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb not found: ID does not exist" containerID="bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.710873 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb"} err="failed to get container status \"bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb\": rpc error: code = NotFound desc = could not find container \"bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb\": container with ID starting with 
bb5678451d708659a55c2e262fd481019296698e7419a1c83977bbc3946b18fb not found: ID does not exist" Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.976640 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:44:06 crc kubenswrapper[5118]: I0223 08:44:06.995359 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff98d5d9f-v4dg9"] Feb 23 08:44:07 crc kubenswrapper[5118]: I0223 08:44:07.716438 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6113f337-ef10-43ad-9b81-84371a082261" path="/var/lib/kubelet/pods/6113f337-ef10-43ad-9b81-84371a082261/volumes" Feb 23 08:44:26 crc kubenswrapper[5118]: I0223 08:44:26.142231 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-674bbb4f9c-tfqpc" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.096078 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5b81-account-create-update-7bbtq"] Feb 23 08:44:33 crc kubenswrapper[5118]: E0223 08:44:33.097490 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="dnsmasq-dns" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.097509 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="dnsmasq-dns" Feb 23 08:44:33 crc kubenswrapper[5118]: E0223 08:44:33.097561 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="init" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.097570 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="init" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.097836 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6113f337-ef10-43ad-9b81-84371a082261" containerName="dnsmasq-dns" Feb 23 08:44:33 
crc kubenswrapper[5118]: I0223 08:44:33.098888 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.104423 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.108766 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-28f62"] Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.110539 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.117334 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5b81-account-create-update-7bbtq"] Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.142917 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjz9t\" (UniqueName: \"kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t\") pod \"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.143214 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.143361 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts\") pod 
\"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.143471 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmpf\" (UniqueName: \"kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.153664 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-28f62"] Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.245203 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts\") pod \"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.245251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmpf\" (UniqueName: \"kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.246231 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjz9t\" (UniqueName: \"kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t\") pod \"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 
08:44:33.246334 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.246557 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts\") pod \"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.247118 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.268327 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmpf\" (UniqueName: \"kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf\") pod \"glance-db-create-28f62\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.268774 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjz9t\" (UniqueName: \"kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t\") pod \"glance-5b81-account-create-update-7bbtq\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.428799 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.450517 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-28f62" Feb 23 08:44:33 crc kubenswrapper[5118]: I0223 08:44:33.954404 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-28f62"] Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.022206 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5b81-account-create-update-7bbtq"] Feb 23 08:44:34 crc kubenswrapper[5118]: W0223 08:44:34.023503 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf300251_6812_4636_8f38_10cfe37fdcd0.slice/crio-5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024 WatchSource:0}: Error finding container 5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024: Status 404 returned error can't find the container with id 5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024 Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.964505 5118 generic.go:334] "Generic (PLEG): container finished" podID="cf300251-6812-4636-8f38-10cfe37fdcd0" containerID="86dd6def0e4fea1e79f818bbe1560fa124fc10038c2e66f922c5f5e605a928b5" exitCode=0 Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.964802 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5b81-account-create-update-7bbtq" event={"ID":"cf300251-6812-4636-8f38-10cfe37fdcd0","Type":"ContainerDied","Data":"86dd6def0e4fea1e79f818bbe1560fa124fc10038c2e66f922c5f5e605a928b5"} Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.964829 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5b81-account-create-update-7bbtq" 
event={"ID":"cf300251-6812-4636-8f38-10cfe37fdcd0","Type":"ContainerStarted","Data":"5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024"} Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.968322 5118 generic.go:334] "Generic (PLEG): container finished" podID="c90b03a3-618b-48e0-8de1-6290a7cf0140" containerID="2528c65314fe9fefc01886e924b1e15c1e14b905f86bd5ce7affd82fafe9704e" exitCode=0 Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.968347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-28f62" event={"ID":"c90b03a3-618b-48e0-8de1-6290a7cf0140","Type":"ContainerDied","Data":"2528c65314fe9fefc01886e924b1e15c1e14b905f86bd5ce7affd82fafe9704e"} Feb 23 08:44:34 crc kubenswrapper[5118]: I0223 08:44:34.968362 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-28f62" event={"ID":"c90b03a3-618b-48e0-8de1-6290a7cf0140","Type":"ContainerStarted","Data":"1294a7e12b52c9f92793c502d6c79feae9414df8c02a31b07dd34b3791162c04"} Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.420779 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-28f62" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.425855 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.622610 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts\") pod \"cf300251-6812-4636-8f38-10cfe37fdcd0\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.622719 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmpf\" (UniqueName: \"kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf\") pod \"c90b03a3-618b-48e0-8de1-6290a7cf0140\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.622795 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjz9t\" (UniqueName: \"kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t\") pod \"cf300251-6812-4636-8f38-10cfe37fdcd0\" (UID: \"cf300251-6812-4636-8f38-10cfe37fdcd0\") " Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.622852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts\") pod \"c90b03a3-618b-48e0-8de1-6290a7cf0140\" (UID: \"c90b03a3-618b-48e0-8de1-6290a7cf0140\") " Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.623564 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf300251-6812-4636-8f38-10cfe37fdcd0" (UID: "cf300251-6812-4636-8f38-10cfe37fdcd0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.623647 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c90b03a3-618b-48e0-8de1-6290a7cf0140" (UID: "c90b03a3-618b-48e0-8de1-6290a7cf0140"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.631764 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf" (OuterVolumeSpecName: "kube-api-access-vhmpf") pod "c90b03a3-618b-48e0-8de1-6290a7cf0140" (UID: "c90b03a3-618b-48e0-8de1-6290a7cf0140"). InnerVolumeSpecName "kube-api-access-vhmpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.631959 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t" (OuterVolumeSpecName: "kube-api-access-fjz9t") pod "cf300251-6812-4636-8f38-10cfe37fdcd0" (UID: "cf300251-6812-4636-8f38-10cfe37fdcd0"). InnerVolumeSpecName "kube-api-access-fjz9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.725015 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90b03a3-618b-48e0-8de1-6290a7cf0140-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.725063 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf300251-6812-4636-8f38-10cfe37fdcd0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.725084 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmpf\" (UniqueName: \"kubernetes.io/projected/c90b03a3-618b-48e0-8de1-6290a7cf0140-kube-api-access-vhmpf\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.725138 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjz9t\" (UniqueName: \"kubernetes.io/projected/cf300251-6812-4636-8f38-10cfe37fdcd0-kube-api-access-fjz9t\") on node \"crc\" DevicePath \"\"" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.990825 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-28f62" event={"ID":"c90b03a3-618b-48e0-8de1-6290a7cf0140","Type":"ContainerDied","Data":"1294a7e12b52c9f92793c502d6c79feae9414df8c02a31b07dd34b3791162c04"} Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.990898 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1294a7e12b52c9f92793c502d6c79feae9414df8c02a31b07dd34b3791162c04" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.990999 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-28f62" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.993503 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5b81-account-create-update-7bbtq" event={"ID":"cf300251-6812-4636-8f38-10cfe37fdcd0","Type":"ContainerDied","Data":"5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024"} Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.993547 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5b81-account-create-update-7bbtq" Feb 23 08:44:36 crc kubenswrapper[5118]: I0223 08:44:36.993569 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c860e389aa528a767ddba40b3d39be370b6cf4aeeeaa56c24eed1a05ad5d024" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.265782 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-9mmtr"] Feb 23 08:44:38 crc kubenswrapper[5118]: E0223 08:44:38.266840 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90b03a3-618b-48e0-8de1-6290a7cf0140" containerName="mariadb-database-create" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.266864 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90b03a3-618b-48e0-8de1-6290a7cf0140" containerName="mariadb-database-create" Feb 23 08:44:38 crc kubenswrapper[5118]: E0223 08:44:38.266911 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf300251-6812-4636-8f38-10cfe37fdcd0" containerName="mariadb-account-create-update" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.266923 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf300251-6812-4636-8f38-10cfe37fdcd0" containerName="mariadb-account-create-update" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.267200 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf300251-6812-4636-8f38-10cfe37fdcd0" 
containerName="mariadb-account-create-update" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.267226 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90b03a3-618b-48e0-8de1-6290a7cf0140" containerName="mariadb-database-create" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.268114 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.276665 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.277687 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9r2mg" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.282778 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9mmtr"] Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.362444 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.362750 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmlht\" (UniqueName: \"kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.362867 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data\") 
pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.362979 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.464433 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.464521 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmlht\" (UniqueName: \"kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.464570 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.464593 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " 
pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.470018 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.476468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.478516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.485459 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmlht\" (UniqueName: \"kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht\") pod \"glance-db-sync-9mmtr\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") " pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:38 crc kubenswrapper[5118]: I0223 08:44:38.638721 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-9mmtr" Feb 23 08:44:39 crc kubenswrapper[5118]: I0223 08:44:39.357022 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-9mmtr"] Feb 23 08:44:39 crc kubenswrapper[5118]: I0223 08:44:39.376242 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:44:40 crc kubenswrapper[5118]: I0223 08:44:40.026120 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9mmtr" event={"ID":"ff365ce8-8d93-410b-a97d-5f5b947652d9","Type":"ContainerStarted","Data":"50b1a5fc6185693403239688bc4d4b7f264dd1958b8106fb449db33560cdf80b"} Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.007997 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tspgk"] Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.013827 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.042716 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tspgk"] Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.100062 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.100283 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqqn\" (UniqueName: \"kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn\") pod \"certified-operators-tspgk\" (UID: 
\"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.100333 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.201811 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.201981 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqqn\" (UniqueName: \"kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.202050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.202641 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content\") pod \"certified-operators-tspgk\" (UID: 
\"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.204146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.228534 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqqn\" (UniqueName: \"kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn\") pod \"certified-operators-tspgk\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") " pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.351743 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:44:49 crc kubenswrapper[5118]: I0223 08:44:49.937343 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tspgk"] Feb 23 08:44:50 crc kubenswrapper[5118]: I0223 08:44:50.130818 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tspgk" event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerStarted","Data":"49dfc8f46b402d09156b9ed3329351f488c13489e5e5bf32f48a23d30d16b055"} Feb 23 08:44:51 crc kubenswrapper[5118]: I0223 08:44:51.147071 5118 generic.go:334] "Generic (PLEG): container finished" podID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerID="9463b6a476911d7f6e58f1245fcee01d8539f039ceeb1b129db4d08ab1397d9d" exitCode=0 Feb 23 08:44:51 crc kubenswrapper[5118]: I0223 08:44:51.147203 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tspgk" 
event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerDied","Data":"9463b6a476911d7f6e58f1245fcee01d8539f039ceeb1b129db4d08ab1397d9d"} Feb 23 08:44:52 crc kubenswrapper[5118]: I0223 08:44:52.128727 5118 scope.go:117] "RemoveContainer" containerID="f2b275fc7e60788c5a1643b1a406e58a8a1e4a5c7d059999a801e8ab0beba8ee" Feb 23 08:44:58 crc kubenswrapper[5118]: I0223 08:44:58.258604 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9mmtr" event={"ID":"ff365ce8-8d93-410b-a97d-5f5b947652d9","Type":"ContainerStarted","Data":"9ff3711eb0b440cc3fc36d5622e4325e7b5eb58682bbe5f6e167110cf8a7c7be"} Feb 23 08:44:58 crc kubenswrapper[5118]: I0223 08:44:58.262427 5118 generic.go:334] "Generic (PLEG): container finished" podID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerID="d817b045c95fd4486230aca6365f0216939986e38978f6a0b349b448aa3572d8" exitCode=0 Feb 23 08:44:58 crc kubenswrapper[5118]: I0223 08:44:58.262861 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tspgk" event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerDied","Data":"d817b045c95fd4486230aca6365f0216939986e38978f6a0b349b448aa3572d8"} Feb 23 08:44:58 crc kubenswrapper[5118]: I0223 08:44:58.292433 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-9mmtr" podStartSLOduration=2.54000054 podStartE2EDuration="20.292403579s" podCreationTimestamp="2026-02-23 08:44:38 +0000 UTC" firstStartedPulling="2026-02-23 08:44:39.375915454 +0000 UTC m=+7142.379700047" lastFinishedPulling="2026-02-23 08:44:57.128318473 +0000 UTC m=+7160.132103086" observedRunningTime="2026-02-23 08:44:58.279191291 +0000 UTC m=+7161.282975914" watchObservedRunningTime="2026-02-23 08:44:58.292403579 +0000 UTC m=+7161.296188162" Feb 23 08:44:59 crc kubenswrapper[5118]: I0223 08:44:59.275297 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-tspgk" event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerStarted","Data":"576d1edb7ea12debcea3793664e6d0fa2c8e08920d7f4b14a76cc10184f20317"}
Feb 23 08:44:59 crc kubenswrapper[5118]: I0223 08:44:59.311150 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tspgk" podStartSLOduration=9.586238051 podStartE2EDuration="11.311127034s" podCreationTimestamp="2026-02-23 08:44:48 +0000 UTC" firstStartedPulling="2026-02-23 08:44:56.989713615 +0000 UTC m=+7159.993498228" lastFinishedPulling="2026-02-23 08:44:58.714602608 +0000 UTC m=+7161.718387211" observedRunningTime="2026-02-23 08:44:59.300869388 +0000 UTC m=+7162.304653961" watchObservedRunningTime="2026-02-23 08:44:59.311127034 +0000 UTC m=+7162.314911617"
Feb 23 08:44:59 crc kubenswrapper[5118]: I0223 08:44:59.353149 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tspgk"
Feb 23 08:44:59 crc kubenswrapper[5118]: I0223 08:44:59.353197 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tspgk"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.142501 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"]
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.144247 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.148289 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.152737 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.156211 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"]
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.257441 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp2c\" (UniqueName: \"kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.257555 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.257630 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.359934 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsp2c\" (UniqueName: \"kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.360007 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.360048 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.361869 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.370157 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.405002 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp2c\" (UniqueName: \"kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c\") pod \"collect-profiles-29530605-lfmnc\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.408557 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-tspgk" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="registry-server" probeResult="failure" output=<
Feb 23 08:45:00 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s
Feb 23 08:45:00 crc kubenswrapper[5118]: >
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.474894 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:00 crc kubenswrapper[5118]: I0223 08:45:00.980794 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"]
Feb 23 08:45:01 crc kubenswrapper[5118]: I0223 08:45:01.298561 5118 generic.go:334] "Generic (PLEG): container finished" podID="ff365ce8-8d93-410b-a97d-5f5b947652d9" containerID="9ff3711eb0b440cc3fc36d5622e4325e7b5eb58682bbe5f6e167110cf8a7c7be" exitCode=0
Feb 23 08:45:01 crc kubenswrapper[5118]: I0223 08:45:01.298642 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9mmtr" event={"ID":"ff365ce8-8d93-410b-a97d-5f5b947652d9","Type":"ContainerDied","Data":"9ff3711eb0b440cc3fc36d5622e4325e7b5eb58682bbe5f6e167110cf8a7c7be"}
Feb 23 08:45:01 crc kubenswrapper[5118]: I0223 08:45:01.300232 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" event={"ID":"3523972d-1d1d-48dc-9cef-537318ac933f","Type":"ContainerStarted","Data":"60457dc2873dba059482d074740ef754f88622c1391d18ffc2c9bbad22daeed1"}
Feb 23 08:45:01 crc kubenswrapper[5118]: I0223 08:45:01.300276 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" event={"ID":"3523972d-1d1d-48dc-9cef-537318ac933f","Type":"ContainerStarted","Data":"6994eeab729f48c37e3240c27428a99de44a2e5c178ff73724128dd944c78af0"}
Feb 23 08:45:01 crc kubenswrapper[5118]: I0223 08:45:01.347231 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" podStartSLOduration=1.347204581 podStartE2EDuration="1.347204581s" podCreationTimestamp="2026-02-23 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:01.334909085 +0000 UTC m=+7164.338693658" watchObservedRunningTime="2026-02-23 08:45:01.347204581 +0000 UTC m=+7164.350989154"
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.317221 5118 generic.go:334] "Generic (PLEG): container finished" podID="3523972d-1d1d-48dc-9cef-537318ac933f" containerID="60457dc2873dba059482d074740ef754f88622c1391d18ffc2c9bbad22daeed1" exitCode=0
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.317508 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" event={"ID":"3523972d-1d1d-48dc-9cef-537318ac933f","Type":"ContainerDied","Data":"60457dc2873dba059482d074740ef754f88622c1391d18ffc2c9bbad22daeed1"}
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.825385 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9mmtr"
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.921474 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data\") pod \"ff365ce8-8d93-410b-a97d-5f5b947652d9\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") "
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.922476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle\") pod \"ff365ce8-8d93-410b-a97d-5f5b947652d9\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") "
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.922767 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmlht\" (UniqueName: \"kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht\") pod \"ff365ce8-8d93-410b-a97d-5f5b947652d9\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") "
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.922924 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data\") pod \"ff365ce8-8d93-410b-a97d-5f5b947652d9\" (UID: \"ff365ce8-8d93-410b-a97d-5f5b947652d9\") "
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.929970 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht" (OuterVolumeSpecName: "kube-api-access-kmlht") pod "ff365ce8-8d93-410b-a97d-5f5b947652d9" (UID: "ff365ce8-8d93-410b-a97d-5f5b947652d9"). InnerVolumeSpecName "kube-api-access-kmlht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.931076 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ff365ce8-8d93-410b-a97d-5f5b947652d9" (UID: "ff365ce8-8d93-410b-a97d-5f5b947652d9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:02 crc kubenswrapper[5118]: I0223 08:45:02.975271 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff365ce8-8d93-410b-a97d-5f5b947652d9" (UID: "ff365ce8-8d93-410b-a97d-5f5b947652d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.007977 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data" (OuterVolumeSpecName: "config-data") pod "ff365ce8-8d93-410b-a97d-5f5b947652d9" (UID: "ff365ce8-8d93-410b-a97d-5f5b947652d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.026869 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.026918 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.026939 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff365ce8-8d93-410b-a97d-5f5b947652d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.026959 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmlht\" (UniqueName: \"kubernetes.io/projected/ff365ce8-8d93-410b-a97d-5f5b947652d9-kube-api-access-kmlht\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.328707 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-9mmtr" event={"ID":"ff365ce8-8d93-410b-a97d-5f5b947652d9","Type":"ContainerDied","Data":"50b1a5fc6185693403239688bc4d4b7f264dd1958b8106fb449db33560cdf80b"}
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.328767 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b1a5fc6185693403239688bc4d4b7f264dd1958b8106fb449db33560cdf80b"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.328739 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-9mmtr"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.702862 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.751682 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume\") pod \"3523972d-1d1d-48dc-9cef-537318ac933f\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") "
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.751805 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume\") pod \"3523972d-1d1d-48dc-9cef-537318ac933f\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") "
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.751846 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsp2c\" (UniqueName: \"kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c\") pod \"3523972d-1d1d-48dc-9cef-537318ac933f\" (UID: \"3523972d-1d1d-48dc-9cef-537318ac933f\") "
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.753012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3523972d-1d1d-48dc-9cef-537318ac933f" (UID: "3523972d-1d1d-48dc-9cef-537318ac933f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.763349 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c" (OuterVolumeSpecName: "kube-api-access-vsp2c") pod "3523972d-1d1d-48dc-9cef-537318ac933f" (UID: "3523972d-1d1d-48dc-9cef-537318ac933f"). InnerVolumeSpecName "kube-api-access-vsp2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.784226 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"]
Feb 23 08:45:03 crc kubenswrapper[5118]: E0223 08:45:03.785026 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff365ce8-8d93-410b-a97d-5f5b947652d9" containerName="glance-db-sync"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.785150 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff365ce8-8d93-410b-a97d-5f5b947652d9" containerName="glance-db-sync"
Feb 23 08:45:03 crc kubenswrapper[5118]: E0223 08:45:03.785259 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3523972d-1d1d-48dc-9cef-537318ac933f" containerName="collect-profiles"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.785331 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3523972d-1d1d-48dc-9cef-537318ac933f" containerName="collect-profiles"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.785680 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3523972d-1d1d-48dc-9cef-537318ac933f" containerName="collect-profiles"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.786245 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff365ce8-8d93-410b-a97d-5f5b947652d9" containerName="glance-db-sync"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.786063 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3523972d-1d1d-48dc-9cef-537318ac933f" (UID: "3523972d-1d1d-48dc-9cef-537318ac933f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.787853 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.796537 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.798316 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.804756 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.805021 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9r2mg"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.805221 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.805348 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.818418 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.834188 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"]
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856028 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856126 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856487 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856513 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856555 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856598 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5hhh\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856629 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856654 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856696 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbhb\" (UniqueName: \"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856723 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856796 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3523972d-1d1d-48dc-9cef-537318ac933f-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856814 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3523972d-1d1d-48dc-9cef-537318ac933f-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.856826 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsp2c\" (UniqueName: \"kubernetes.io/projected/3523972d-1d1d-48dc-9cef-537318ac933f-kube-api-access-vsp2c\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.870249 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.872598 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.879734 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.918157 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959118 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959190 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959226 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959248 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959267 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959302 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959332 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959353 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959373 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5hhh\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959390 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959415 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959438 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959460 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959490 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbhb\" (UniqueName: \"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959511 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc75\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959535 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959564 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959602 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.959625 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.960727 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.960907 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.961510 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.962045 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.962282 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.962716 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.966000 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.966608 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.967405 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.967540 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.990004 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbhb\" (UniqueName: \"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb\") pod \"dnsmasq-dns-6dfb985d97-f7qdh\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh"
Feb 23 08:45:03 crc kubenswrapper[5118]: I0223 08:45:03.991216 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5hhh\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh\") pod \"glance-default-external-api-0\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061243 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061295 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061385 5118 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.061436 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc75\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.062511 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.062639 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.064958 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.065569 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.066432 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.067035 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.076474 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc75\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75\") pod \"glance-default-internal-api-0\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.150424 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.160898 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.202250 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.362444 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" event={"ID":"3523972d-1d1d-48dc-9cef-537318ac933f","Type":"ContainerDied","Data":"6994eeab729f48c37e3240c27428a99de44a2e5c178ff73724128dd944c78af0"} Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.362870 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6994eeab729f48c37e3240c27428a99de44a2e5c178ff73724128dd944c78af0" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.363006 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc" Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.446212 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"] Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.453561 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530560-kr8cj"] Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.712949 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"] Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.945085 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:45:04 crc kubenswrapper[5118]: I0223 08:45:04.980538 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:45:04 crc kubenswrapper[5118]: W0223 08:45:04.990416 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd74d2ea_d09e_4df3_ae4c_7c5d08a89514.slice/crio-3c46d973307f6e5412711e294ff2c92b94e6aa273b74e32a52679754fc30ccb3 WatchSource:0}: Error finding container 3c46d973307f6e5412711e294ff2c92b94e6aa273b74e32a52679754fc30ccb3: Status 404 returned error can't find the container with id 3c46d973307f6e5412711e294ff2c92b94e6aa273b74e32a52679754fc30ccb3 Feb 23 08:45:05 crc kubenswrapper[5118]: I0223 08:45:05.377975 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" event={"ID":"b3afb984-0794-450e-a141-0068ac7fca00","Type":"ContainerStarted","Data":"f60eee71263987cee4f1d0ba9f8a5ae8df7c302cef506c431d928a67874cea05"} Feb 23 08:45:05 crc kubenswrapper[5118]: I0223 08:45:05.379472 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerStarted","Data":"3c46d973307f6e5412711e294ff2c92b94e6aa273b74e32a52679754fc30ccb3"} Feb 23 08:45:05 crc kubenswrapper[5118]: I0223 08:45:05.714447 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="917bdb73-96fa-41cf-b160-791f8e4503b7" path="/var/lib/kubelet/pods/917bdb73-96fa-41cf-b160-791f8e4503b7/volumes" Feb 23 08:45:06 crc kubenswrapper[5118]: I0223 08:45:06.039167 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:45:06 crc kubenswrapper[5118]: I0223 08:45:06.399283 5118 generic.go:334] "Generic (PLEG): container finished" podID="b3afb984-0794-450e-a141-0068ac7fca00" containerID="aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b" exitCode=0 Feb 23 08:45:06 crc kubenswrapper[5118]: I0223 08:45:06.399385 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" event={"ID":"b3afb984-0794-450e-a141-0068ac7fca00","Type":"ContainerDied","Data":"aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b"} Feb 23 08:45:06 crc kubenswrapper[5118]: I0223 08:45:06.404573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerStarted","Data":"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741"} Feb 23 08:45:06 crc kubenswrapper[5118]: I0223 08:45:06.407849 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerStarted","Data":"a5e1d11b032570cc4bfeb53a4231fe8c064ee749f68c905a417d97eb6651e854"} Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.331607 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:45:07 crc 
kubenswrapper[5118]: I0223 08:45:07.437226 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" event={"ID":"b3afb984-0794-450e-a141-0068ac7fca00","Type":"ContainerStarted","Data":"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805"} Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.437407 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.443884 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerStarted","Data":"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956"} Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.444059 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-log" containerID="cri-o://07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741" gracePeriod=30 Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.444310 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-httpd" containerID="cri-o://2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" gracePeriod=30 Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.452116 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerStarted","Data":"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"} Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.452156 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerStarted","Data":"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"} Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.465201 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" podStartSLOduration=4.465184837 podStartE2EDuration="4.465184837s" podCreationTimestamp="2026-02-23 08:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:07.461506378 +0000 UTC m=+7170.465290951" watchObservedRunningTime="2026-02-23 08:45:07.465184837 +0000 UTC m=+7170.468969410" Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.495027 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.495005385 podStartE2EDuration="4.495005385s" podCreationTimestamp="2026-02-23 08:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:07.491385768 +0000 UTC m=+7170.495170341" watchObservedRunningTime="2026-02-23 08:45:07.495005385 +0000 UTC m=+7170.498789958" Feb 23 08:45:07 crc kubenswrapper[5118]: I0223 08:45:07.530820 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.530790647 podStartE2EDuration="4.530790647s" podCreationTimestamp="2026-02-23 08:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:07.528222545 +0000 UTC m=+7170.532007118" watchObservedRunningTime="2026-02-23 08:45:07.530790647 +0000 UTC m=+7170.534575220" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.235631 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.324672 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.324817 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.324877 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.324986 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.325037 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5hhh\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.325086 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.326050 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.326234 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs\") pod \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\" (UID: \"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514\") " Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.326580 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs" (OuterVolumeSpecName: "logs") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.327318 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.327350 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.330478 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh" (OuterVolumeSpecName: "kube-api-access-w5hhh") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "kube-api-access-w5hhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.331174 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph" (OuterVolumeSpecName: "ceph") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.332913 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts" (OuterVolumeSpecName: "scripts") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.367411 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.376780 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data" (OuterVolumeSpecName: "config-data") pod "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" (UID: "cd74d2ea-d09e-4df3-ae4c-7c5d08a89514"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.430188 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.430240 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.430255 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.430271 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5hhh\" (UniqueName: \"kubernetes.io/projected/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-kube-api-access-w5hhh\") on node \"crc\" DevicePath \"\"" Feb 23 
08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.430287 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.465893 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerID="2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" exitCode=0 Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.465976 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerID="07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741" exitCode=143 Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466071 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerDied","Data":"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956"} Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466218 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerDied","Data":"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741"} Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466231 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd74d2ea-d09e-4df3-ae4c-7c5d08a89514","Type":"ContainerDied","Data":"3c46d973307f6e5412711e294ff2c92b94e6aa273b74e32a52679754fc30ccb3"} Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466254 5118 scope.go:117] "RemoveContainer" containerID="2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466447 5118 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/glance-default-internal-api-0" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-log" containerID="cri-o://93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb" gracePeriod=30 Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466530 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-httpd" containerID="cri-o://79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3" gracePeriod=30 Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.466626 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.504497 5118 scope.go:117] "RemoveContainer" containerID="07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.513631 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.520408 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.542023 5118 scope.go:117] "RemoveContainer" containerID="2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" Feb 23 08:45:08 crc kubenswrapper[5118]: E0223 08:45:08.542737 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956\": container with ID starting with 2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956 not found: ID does not exist" containerID="2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 
08:45:08.542776 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956"} err="failed to get container status \"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956\": rpc error: code = NotFound desc = could not find container \"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956\": container with ID starting with 2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956 not found: ID does not exist" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.542801 5118 scope.go:117] "RemoveContainer" containerID="07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741" Feb 23 08:45:08 crc kubenswrapper[5118]: E0223 08:45:08.543438 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741\": container with ID starting with 07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741 not found: ID does not exist" containerID="07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.543598 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741"} err="failed to get container status \"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741\": rpc error: code = NotFound desc = could not find container \"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741\": container with ID starting with 07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741 not found: ID does not exist" Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.543743 5118 scope.go:117] "RemoveContainer" containerID="2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956" Feb 23 08:45:08 crc 
kubenswrapper[5118]: I0223 08:45:08.543974 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:45:08 crc kubenswrapper[5118]: E0223 08:45:08.544354 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-httpd"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544373 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-httpd"
Feb 23 08:45:08 crc kubenswrapper[5118]: E0223 08:45:08.544391 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-log"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544397 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-log"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544506 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956"} err="failed to get container status \"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956\": rpc error: code = NotFound desc = could not find container \"2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956\": container with ID starting with 2fc482a6923cbe86b65240f618681d1972cf6f3ba73bd11af1d8740efbc4b956 not found: ID does not exist"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544546 5118 scope.go:117] "RemoveContainer" containerID="07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544559 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-httpd"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544579 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" containerName="glance-log"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.544834 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741"} err="failed to get container status \"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741\": rpc error: code = NotFound desc = could not find container \"07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741\": container with ID starting with 07565d8820717a39ad95cb5b56e12a239b124027416b52d218f30d3c25104741 not found: ID does not exist"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.545488 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.549697 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.557816 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.633626 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.633823 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.633929 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.634141 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.634298 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.634368 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.634434 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h744\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736025 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736353 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736471 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736473 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736518 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h744\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736585 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736623 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.736647 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.737434 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.741731 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.742198 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.744755 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.745928 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.755505 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h744\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744\") pod \"glance-default-external-api-0\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " pod="openstack/glance-default-external-api-0"
Feb 23 08:45:08 crc kubenswrapper[5118]: I0223 08:45:08.871129 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.058442 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152660 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152713 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152734 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152781 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152809 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152844 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpc75\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.152983 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run\") pod \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\" (UID: \"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a\") "
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.153192 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs" (OuterVolumeSpecName: "logs") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.153402 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.153681 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.158773 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph" (OuterVolumeSpecName: "ceph") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.159240 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75" (OuterVolumeSpecName: "kube-api-access-lpc75") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "kube-api-access-lpc75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.161256 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts" (OuterVolumeSpecName: "scripts") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.204675 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.228275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data" (OuterVolumeSpecName: "config-data") pod "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" (UID: "f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255762 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255845 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255857 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255867 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255877 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.255892 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpc75\" (UniqueName: \"kubernetes.io/projected/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a-kube-api-access-lpc75\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.403585 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tspgk"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.444728 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tspgk"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485448 5118 generic.go:334] "Generic (PLEG): container finished" podID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerID="79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3" exitCode=0
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485485 5118 generic.go:334] "Generic (PLEG): container finished" podID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerID="93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb" exitCode=143
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485536 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerDied","Data":"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"}
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerDied","Data":"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"}
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485579 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a","Type":"ContainerDied","Data":"a5e1d11b032570cc4bfeb53a4231fe8c064ee749f68c905a417d97eb6651e854"}
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485597 5118 scope.go:117] "RemoveContainer" containerID="79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.485840 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.526686 5118 scope.go:117] "RemoveContainer" containerID="93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.532788 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.551362 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.556181 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.567233 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:09 crc kubenswrapper[5118]: E0223 08:45:09.567679 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-httpd"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.567693 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-httpd"
Feb 23 08:45:09 crc kubenswrapper[5118]: E0223 08:45:09.567707 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-log"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.567713 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-log"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.567858 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-httpd"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.567886 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" containerName="glance-log"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.568809 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.573883 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.575410 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.615888 5118 scope.go:117] "RemoveContainer" containerID="79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"
Feb 23 08:45:09 crc kubenswrapper[5118]: E0223 08:45:09.616417 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3\": container with ID starting with 79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3 not found: ID does not exist" containerID="79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.616460 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"} err="failed to get container status \"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3\": rpc error: code = NotFound desc = could not find container \"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3\": container with ID starting with 79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3 not found: ID does not exist"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.616494 5118 scope.go:117] "RemoveContainer" containerID="93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"
Feb 23 08:45:09 crc kubenswrapper[5118]: E0223 08:45:09.616968 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb\": container with ID starting with 93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb not found: ID does not exist" containerID="93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.617000 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"} err="failed to get container status \"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb\": rpc error: code = NotFound desc = could not find container \"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb\": container with ID starting with 93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb not found: ID does not exist"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.617028 5118 scope.go:117] "RemoveContainer" containerID="79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.617244 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3"} err="failed to get container status \"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3\": rpc error: code = NotFound desc = could not find container \"79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3\": container with ID starting with 79d50518154f6dafa688d06a7434354ed4c334235fa61e08b33ef4f2bb19baa3 not found: ID does not exist"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.617266 5118 scope.go:117] "RemoveContainer" containerID="93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.617579 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb"} err="failed to get container status \"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb\": rpc error: code = NotFound desc = could not find container \"93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb\": container with ID starting with 93d99a6d20893af510cde86e8b371d04af49ab4add8c01bcb4fb29d5feb158bb not found: ID does not exist"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.644700 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tspgk"]
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664404 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664490 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664722 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664818 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.664883 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.665025 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8zr\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.705739 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd74d2ea-d09e-4df3-ae4c-7c5d08a89514" path="/var/lib/kubelet/pods/cd74d2ea-d09e-4df3-ae4c-7c5d08a89514/volumes"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.706646 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a" path="/var/lib/kubelet/pods/f0f2300b-5511-4cd9-a3f9-1d3c6ddd8e5a/volumes"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769050 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8zr\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769264 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769314 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769362 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769416 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.769447 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.770408 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.770638 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.774962 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.775754 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.776018 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.776164 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.787694 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8zr\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr\") pod \"glance-default-internal-api-0\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:09 crc kubenswrapper[5118]: I0223 08:45:09.948415 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 23 08:45:10 crc kubenswrapper[5118]: I0223 08:45:10.298859 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:45:10 crc kubenswrapper[5118]: W0223 08:45:10.304867 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde407020_eb83_4653_8880_2fd15bee8791.slice/crio-61cf4b0db9dc81c3ee85f5820d3705495f5040ce7ee960c88c33f2e57ded0c0d WatchSource:0}: Error finding container 61cf4b0db9dc81c3ee85f5820d3705495f5040ce7ee960c88c33f2e57ded0c0d: Status 404 returned error can't find the container with id 61cf4b0db9dc81c3ee85f5820d3705495f5040ce7ee960c88c33f2e57ded0c0d
Feb 23 08:45:10 crc kubenswrapper[5118]: I0223 08:45:10.501899 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerStarted","Data":"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25"}
Feb 23 08:45:10 crc kubenswrapper[5118]: I0223 08:45:10.501945 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerStarted","Data":"c7efe83597a93a1ddee41fa59313a0f0ddabc9ef894abcd1abf42ee725c4001e"}
Feb 23 08:45:10 crc kubenswrapper[5118]: I0223 08:45:10.503403 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tspgk" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="registry-server" containerID="cri-o://576d1edb7ea12debcea3793664e6d0fa2c8e08920d7f4b14a76cc10184f20317" gracePeriod=2
Feb 23 08:45:10 crc kubenswrapper[5118]: I0223 08:45:10.503720 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerStarted","Data":"61cf4b0db9dc81c3ee85f5820d3705495f5040ce7ee960c88c33f2e57ded0c0d"}
Feb 23 08:45:11 crc kubenswrapper[5118]: I0223 08:45:11.523706 5118 generic.go:334] "Generic (PLEG): container finished" podID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerID="576d1edb7ea12debcea3793664e6d0fa2c8e08920d7f4b14a76cc10184f20317" exitCode=0
Feb 23 08:45:11 crc kubenswrapper[5118]: I0223 08:45:11.523798 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tspgk" event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerDied","Data":"576d1edb7ea12debcea3793664e6d0fa2c8e08920d7f4b14a76cc10184f20317"}
Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.807898 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tspgk"
Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.931298 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities\") pod \"606c4177-3dc3-4d64-a849-30f92171f7f3\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") "
Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.931517 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jqqn\" (UniqueName: \"kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn\") pod \"606c4177-3dc3-4d64-a849-30f92171f7f3\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") "
Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.931555 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content\") pod \"606c4177-3dc3-4d64-a849-30f92171f7f3\" (UID: \"606c4177-3dc3-4d64-a849-30f92171f7f3\") "
Feb
23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.933315 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities" (OuterVolumeSpecName: "utilities") pod "606c4177-3dc3-4d64-a849-30f92171f7f3" (UID: "606c4177-3dc3-4d64-a849-30f92171f7f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.934650 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn" (OuterVolumeSpecName: "kube-api-access-5jqqn") pod "606c4177-3dc3-4d64-a849-30f92171f7f3" (UID: "606c4177-3dc3-4d64-a849-30f92171f7f3"). InnerVolumeSpecName "kube-api-access-5jqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:12 crc kubenswrapper[5118]: I0223 08:45:12.983692 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "606c4177-3dc3-4d64-a849-30f92171f7f3" (UID: "606c4177-3dc3-4d64-a849-30f92171f7f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.034204 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.034260 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jqqn\" (UniqueName: \"kubernetes.io/projected/606c4177-3dc3-4d64-a849-30f92171f7f3-kube-api-access-5jqqn\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.034278 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/606c4177-3dc3-4d64-a849-30f92171f7f3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.552218 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerStarted","Data":"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6"} Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.552298 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerStarted","Data":"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213"} Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.555281 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerStarted","Data":"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26"} Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.559164 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tspgk" 
event={"ID":"606c4177-3dc3-4d64-a849-30f92171f7f3","Type":"ContainerDied","Data":"49dfc8f46b402d09156b9ed3329351f488c13489e5e5bf32f48a23d30d16b055"} Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.559222 5118 scope.go:117] "RemoveContainer" containerID="576d1edb7ea12debcea3793664e6d0fa2c8e08920d7f4b14a76cc10184f20317" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.559317 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tspgk" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.597237 5118 scope.go:117] "RemoveContainer" containerID="d817b045c95fd4486230aca6365f0216939986e38978f6a0b349b448aa3572d8" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.616300 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.6162763 podStartE2EDuration="4.6162763s" podCreationTimestamp="2026-02-23 08:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:13.584831352 +0000 UTC m=+7176.588615975" watchObservedRunningTime="2026-02-23 08:45:13.6162763 +0000 UTC m=+7176.620060883" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.684368 5118 scope.go:117] "RemoveContainer" containerID="9463b6a476911d7f6e58f1245fcee01d8539f039ceeb1b129db4d08ab1397d9d" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.685164 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.685140418 podStartE2EDuration="5.685140418s" podCreationTimestamp="2026-02-23 08:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:13.66029925 +0000 UTC m=+7176.664083833" watchObservedRunningTime="2026-02-23 08:45:13.685140418 +0000 
UTC m=+7176.688925001" Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.712289 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tspgk"] Feb 23 08:45:13 crc kubenswrapper[5118]: I0223 08:45:13.712335 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tspgk"] Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.154438 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.254862 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"] Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.255753 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="dnsmasq-dns" containerID="cri-o://e11321901c3d910ba16b14f88ec264ba526b1ee0c8e6a29b59254945447c8a77" gracePeriod=10 Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.590681 5118 generic.go:334] "Generic (PLEG): container finished" podID="377d595e-f15a-4914-8860-b3f447cf04fc" containerID="e11321901c3d910ba16b14f88ec264ba526b1ee0c8e6a29b59254945447c8a77" exitCode=0 Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.590743 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" event={"ID":"377d595e-f15a-4914-8860-b3f447cf04fc","Type":"ContainerDied","Data":"e11321901c3d910ba16b14f88ec264ba526b1ee0c8e6a29b59254945447c8a77"} Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.834752 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.986855 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7jhn\" (UniqueName: \"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn\") pod \"377d595e-f15a-4914-8860-b3f447cf04fc\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.987017 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb\") pod \"377d595e-f15a-4914-8860-b3f447cf04fc\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.987083 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config\") pod \"377d595e-f15a-4914-8860-b3f447cf04fc\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.987161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb\") pod \"377d595e-f15a-4914-8860-b3f447cf04fc\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.987207 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc\") pod \"377d595e-f15a-4914-8860-b3f447cf04fc\" (UID: \"377d595e-f15a-4914-8860-b3f447cf04fc\") " Feb 23 08:45:14 crc kubenswrapper[5118]: I0223 08:45:14.994698 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn" (OuterVolumeSpecName: "kube-api-access-m7jhn") pod "377d595e-f15a-4914-8860-b3f447cf04fc" (UID: "377d595e-f15a-4914-8860-b3f447cf04fc"). InnerVolumeSpecName "kube-api-access-m7jhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.039612 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "377d595e-f15a-4914-8860-b3f447cf04fc" (UID: "377d595e-f15a-4914-8860-b3f447cf04fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.051939 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config" (OuterVolumeSpecName: "config") pod "377d595e-f15a-4914-8860-b3f447cf04fc" (UID: "377d595e-f15a-4914-8860-b3f447cf04fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.051978 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "377d595e-f15a-4914-8860-b3f447cf04fc" (UID: "377d595e-f15a-4914-8860-b3f447cf04fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.059883 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "377d595e-f15a-4914-8860-b3f447cf04fc" (UID: "377d595e-f15a-4914-8860-b3f447cf04fc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.090470 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.090542 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7jhn\" (UniqueName: \"kubernetes.io/projected/377d595e-f15a-4914-8860-b3f447cf04fc-kube-api-access-m7jhn\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.090575 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.090601 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.090627 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/377d595e-f15a-4914-8860-b3f447cf04fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.605871 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" event={"ID":"377d595e-f15a-4914-8860-b3f447cf04fc","Type":"ContainerDied","Data":"1c8b17204cf89652438fc4813a53992e500d7d3fd141659d7366eb77a5003925"} Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.605950 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546fd8f979-dwngr" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.605990 5118 scope.go:117] "RemoveContainer" containerID="e11321901c3d910ba16b14f88ec264ba526b1ee0c8e6a29b59254945447c8a77" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.648323 5118 scope.go:117] "RemoveContainer" containerID="f26674dc4bed47340c4cc48ec1b53d50988d15b7575e1bb64f16ebedd754588e" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.674502 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"] Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.686188 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546fd8f979-dwngr"] Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.719422 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" path="/var/lib/kubelet/pods/377d595e-f15a-4914-8860-b3f447cf04fc/volumes" Feb 23 08:45:15 crc kubenswrapper[5118]: I0223 08:45:15.720748 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" path="/var/lib/kubelet/pods/606c4177-3dc3-4d64-a849-30f92171f7f3/volumes" Feb 23 08:45:18 crc kubenswrapper[5118]: I0223 08:45:18.872674 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 08:45:18 crc kubenswrapper[5118]: I0223 08:45:18.873230 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 08:45:18 crc kubenswrapper[5118]: I0223 08:45:18.932964 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 08:45:18 crc kubenswrapper[5118]: I0223 08:45:18.940956 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 
08:45:19 crc kubenswrapper[5118]: I0223 08:45:19.657558 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 08:45:19 crc kubenswrapper[5118]: I0223 08:45:19.658199 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 08:45:19 crc kubenswrapper[5118]: I0223 08:45:19.948659 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:19 crc kubenswrapper[5118]: I0223 08:45:19.950689 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:19 crc kubenswrapper[5118]: I0223 08:45:19.990987 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:20 crc kubenswrapper[5118]: I0223 08:45:20.006463 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:20 crc kubenswrapper[5118]: I0223 08:45:20.672540 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:20 crc kubenswrapper[5118]: I0223 08:45:20.673318 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:21 crc kubenswrapper[5118]: I0223 08:45:21.618253 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 08:45:21 crc kubenswrapper[5118]: I0223 08:45:21.670334 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 08:45:22 crc kubenswrapper[5118]: I0223 08:45:22.780003 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 
08:45:22 crc kubenswrapper[5118]: I0223 08:45:22.780597 5118 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 08:45:22 crc kubenswrapper[5118]: I0223 08:45:22.904769 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.028203 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lv8ql"] Feb 23 08:45:31 crc kubenswrapper[5118]: E0223 08:45:31.029351 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="registry-server" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029368 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="registry-server" Feb 23 08:45:31 crc kubenswrapper[5118]: E0223 08:45:31.029385 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="init" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029391 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="init" Feb 23 08:45:31 crc kubenswrapper[5118]: E0223 08:45:31.029411 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="extract-utilities" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029418 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="extract-utilities" Feb 23 08:45:31 crc kubenswrapper[5118]: E0223 08:45:31.029439 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="extract-content" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029444 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="extract-content" Feb 23 08:45:31 crc kubenswrapper[5118]: E0223 08:45:31.029458 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="dnsmasq-dns" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029464 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="dnsmasq-dns" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029651 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="606c4177-3dc3-4d64-a849-30f92171f7f3" containerName="registry-server" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.029674 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="377d595e-f15a-4914-8860-b3f447cf04fc" containerName="dnsmasq-dns" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.030464 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.045277 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lv8ql"] Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.053819 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.053980 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9np\" (UniqueName: \"kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " 
pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.119968 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9edb-account-create-update-njmtp"] Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.121725 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.129328 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.144906 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9edb-account-create-update-njmtp"] Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.159364 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.159502 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9np\" (UniqueName: \"kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.159600 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 
crc kubenswrapper[5118]: I0223 08:45:31.159661 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvr7\" (UniqueName: \"kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.161150 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.208012 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9np\" (UniqueName: \"kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np\") pod \"placement-db-create-lv8ql\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") " pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.271672 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvr7\" (UniqueName: \"kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.271741 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " 
pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.272674 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.292737 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvr7\" (UniqueName: \"kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7\") pod \"placement-9edb-account-create-update-njmtp\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") " pod="openstack/placement-9edb-account-create-update-njmtp" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.349032 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lv8ql" Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.437241 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9edb-account-create-update-njmtp"
Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.799155 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lv8ql"]
Feb 23 08:45:31 crc kubenswrapper[5118]: W0223 08:45:31.811225 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f07931_7cb9_4a0e_a08d_1bcb570dc50e.slice/crio-d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd WatchSource:0}: Error finding container d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd: Status 404 returned error can't find the container with id d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd
Feb 23 08:45:31 crc kubenswrapper[5118]: I0223 08:45:31.955186 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9edb-account-create-update-njmtp"]
Feb 23 08:45:31 crc kubenswrapper[5118]: W0223 08:45:31.965333 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1da00b9_26a7_4bed_8ac6_f79d47d1f2e2.slice/crio-309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15 WatchSource:0}: Error finding container 309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15: Status 404 returned error can't find the container with id 309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.826557 5118 generic.go:334] "Generic (PLEG): container finished" podID="c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" containerID="f2c8e7a7bb3db481c12653260835aa44391f71ce1f64ab8bda764963bb758289" exitCode=0
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.826685 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9edb-account-create-update-njmtp" event={"ID":"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2","Type":"ContainerDied","Data":"f2c8e7a7bb3db481c12653260835aa44391f71ce1f64ab8bda764963bb758289"}
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.827173 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9edb-account-create-update-njmtp" event={"ID":"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2","Type":"ContainerStarted","Data":"309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15"}
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.830488 5118 generic.go:334] "Generic (PLEG): container finished" podID="70f07931-7cb9-4a0e-a08d-1bcb570dc50e" containerID="75beeca2c1a18c4bee7e944f7edb32ec5389bcec31df6b1cab9f3caff2e0274c" exitCode=0
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.830547 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lv8ql" event={"ID":"70f07931-7cb9-4a0e-a08d-1bcb570dc50e","Type":"ContainerDied","Data":"75beeca2c1a18c4bee7e944f7edb32ec5389bcec31df6b1cab9f3caff2e0274c"}
Feb 23 08:45:32 crc kubenswrapper[5118]: I0223 08:45:32.830620 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lv8ql" event={"ID":"70f07931-7cb9-4a0e-a08d-1bcb570dc50e","Type":"ContainerStarted","Data":"d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd"}
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.347826 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lv8ql"
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.354008 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9edb-account-create-update-njmtp"
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.445071 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts\") pod \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") "
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.445537 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvr7\" (UniqueName: \"kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7\") pod \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") "
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.445576 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9np\" (UniqueName: \"kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np\") pod \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\" (UID: \"70f07931-7cb9-4a0e-a08d-1bcb570dc50e\") "
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.445595 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "70f07931-7cb9-4a0e-a08d-1bcb570dc50e" (UID: "70f07931-7cb9-4a0e-a08d-1bcb570dc50e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.445609 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts\") pod \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\" (UID: \"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2\") "
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.446444 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" (UID: "c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.446919 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.446945 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.453311 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7" (OuterVolumeSpecName: "kube-api-access-jjvr7") pod "c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" (UID: "c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2"). InnerVolumeSpecName "kube-api-access-jjvr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.457426 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np" (OuterVolumeSpecName: "kube-api-access-gn9np") pod "70f07931-7cb9-4a0e-a08d-1bcb570dc50e" (UID: "70f07931-7cb9-4a0e-a08d-1bcb570dc50e"). InnerVolumeSpecName "kube-api-access-gn9np". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.548279 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjvr7\" (UniqueName: \"kubernetes.io/projected/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2-kube-api-access-jjvr7\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.548322 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9np\" (UniqueName: \"kubernetes.io/projected/70f07931-7cb9-4a0e-a08d-1bcb570dc50e-kube-api-access-gn9np\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.852990 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lv8ql" event={"ID":"70f07931-7cb9-4a0e-a08d-1bcb570dc50e","Type":"ContainerDied","Data":"d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd"}
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.853956 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09df76fe7b310948192c50854adf41a5b882a4bb710d780c0c167fabc9ac0dd"
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.853062 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lv8ql"
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.856186 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9edb-account-create-update-njmtp" event={"ID":"c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2","Type":"ContainerDied","Data":"309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15"}
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.856249 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="309f84a8770648dd32d2ee1dc1fc488ef41223e281be48462a607445b0ec4d15"
Feb 23 08:45:34 crc kubenswrapper[5118]: I0223 08:45:34.856349 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9edb-account-create-update-njmtp"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.536858 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-br4mg"]
Feb 23 08:45:36 crc kubenswrapper[5118]: E0223 08:45:36.538064 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" containerName="mariadb-account-create-update"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.538159 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" containerName="mariadb-account-create-update"
Feb 23 08:45:36 crc kubenswrapper[5118]: E0223 08:45:36.538232 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f07931-7cb9-4a0e-a08d-1bcb570dc50e" containerName="mariadb-database-create"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.538306 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f07931-7cb9-4a0e-a08d-1bcb570dc50e" containerName="mariadb-database-create"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.538533 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" containerName="mariadb-account-create-update"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.538599 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f07931-7cb9-4a0e-a08d-1bcb570dc50e" containerName="mariadb-database-create"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.539336 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.543952 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cfl7d"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.544234 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.544235 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.550978 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"]
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.552445 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.571992 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-br4mg"]
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.579583 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"]
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598695 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4dd2\" (UniqueName: \"kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598764 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcldr\" (UniqueName: \"kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598792 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598814 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598840 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598885 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598938 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598959 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.598996 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.599275 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700682 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700739 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4dd2\" (UniqueName: \"kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700766 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcldr\" (UniqueName: \"kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700789 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700814 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700839 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700868 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700918 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700941 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.700980 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.702125 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.702420 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.702609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.702717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.703340 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.708363 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.708414 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.709126 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.720564 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcldr\" (UniqueName: \"kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr\") pod \"placement-db-sync-br4mg\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") " pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.728918 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4dd2\" (UniqueName: \"kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2\") pod \"dnsmasq-dns-85f6d7f58c-xw89l\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.871037 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:36 crc kubenswrapper[5118]: I0223 08:45:36.877383 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.376297 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-br4mg"]
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.429512 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"]
Feb 23 08:45:37 crc kubenswrapper[5118]: W0223 08:45:37.434330 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod444d1530_7a28_4644_8308_1ba2c57dfe2c.slice/crio-e9c9fa74729a08385a9caa8e037b992e12e7ba3676f53217a42508723bff18db WatchSource:0}: Error finding container e9c9fa74729a08385a9caa8e037b992e12e7ba3676f53217a42508723bff18db: Status 404 returned error can't find the container with id e9c9fa74729a08385a9caa8e037b992e12e7ba3676f53217a42508723bff18db
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.888208 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-br4mg" event={"ID":"8d79556d-66cd-44c2-84b0-d20e1c8768b0","Type":"ContainerStarted","Data":"7fd2e7a3295b3890780e2439a76f5b326a31e61a3539f23a96a62336a32bd845"}
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.891284 5118 generic.go:334] "Generic (PLEG): container finished" podID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerID="6352248c154b9174e43a38f36c0fe09b42a8a216d60075223ec542e52b48cf26" exitCode=0
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.891344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" event={"ID":"444d1530-7a28-4644-8308-1ba2c57dfe2c","Type":"ContainerDied","Data":"6352248c154b9174e43a38f36c0fe09b42a8a216d60075223ec542e52b48cf26"}
Feb 23 08:45:37 crc kubenswrapper[5118]: I0223 08:45:37.891380 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" event={"ID":"444d1530-7a28-4644-8308-1ba2c57dfe2c","Type":"ContainerStarted","Data":"e9c9fa74729a08385a9caa8e037b992e12e7ba3676f53217a42508723bff18db"}
Feb 23 08:45:38 crc kubenswrapper[5118]: I0223 08:45:38.901223 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" event={"ID":"444d1530-7a28-4644-8308-1ba2c57dfe2c","Type":"ContainerStarted","Data":"bd5e08520647fe05a940a13bb8b1c905fbc7c5ae94830059983c760825cc1b5b"}
Feb 23 08:45:38 crc kubenswrapper[5118]: I0223 08:45:38.901722 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l"
Feb 23 08:45:38 crc kubenswrapper[5118]: I0223 08:45:38.923999 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" podStartSLOduration=2.923981992 podStartE2EDuration="2.923981992s" podCreationTimestamp="2026-02-23 08:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:38.919239548 +0000 UTC m=+7201.923024121" watchObservedRunningTime="2026-02-23 08:45:38.923981992 +0000 UTC m=+7201.927766565"
Feb 23 08:45:40 crc kubenswrapper[5118]: I0223 08:45:40.928484 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-br4mg" event={"ID":"8d79556d-66cd-44c2-84b0-d20e1c8768b0","Type":"ContainerStarted","Data":"10a7ae64db688633500efa2e11ec05d691ccca7ba6abfcacaef23f6f841acdcf"}
Feb 23 08:45:40 crc kubenswrapper[5118]: I0223 08:45:40.964751 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-br4mg" podStartSLOduration=1.744433594 podStartE2EDuration="4.964715701s" podCreationTimestamp="2026-02-23 08:45:36 +0000 UTC" firstStartedPulling="2026-02-23 08:45:37.384026424 +0000 UTC m=+7200.387811007" lastFinishedPulling="2026-02-23 08:45:40.604308531 +0000 UTC m=+7203.608093114" observedRunningTime="2026-02-23 08:45:40.952703442 +0000 UTC m=+7203.956488055" watchObservedRunningTime="2026-02-23 08:45:40.964715701 +0000 UTC m=+7203.968500304"
Feb 23 08:45:42 crc kubenswrapper[5118]: I0223 08:45:42.986049 5118 generic.go:334] "Generic (PLEG): container finished" podID="8d79556d-66cd-44c2-84b0-d20e1c8768b0" containerID="10a7ae64db688633500efa2e11ec05d691ccca7ba6abfcacaef23f6f841acdcf" exitCode=0
Feb 23 08:45:42 crc kubenswrapper[5118]: I0223 08:45:42.986197 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-br4mg" event={"ID":"8d79556d-66cd-44c2-84b0-d20e1c8768b0","Type":"ContainerDied","Data":"10a7ae64db688633500efa2e11ec05d691ccca7ba6abfcacaef23f6f841acdcf"}
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.484816 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.563445 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data\") pod \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") "
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.564292 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts\") pod \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") "
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.569986 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts" (OuterVolumeSpecName: "scripts") pod "8d79556d-66cd-44c2-84b0-d20e1c8768b0" (UID: "8d79556d-66cd-44c2-84b0-d20e1c8768b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.588522 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data" (OuterVolumeSpecName: "config-data") pod "8d79556d-66cd-44c2-84b0-d20e1c8768b0" (UID: "8d79556d-66cd-44c2-84b0-d20e1c8768b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666069 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcldr\" (UniqueName: \"kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr\") pod \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") "
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666191 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle\") pod \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") "
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666472 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs\") pod \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\" (UID: \"8d79556d-66cd-44c2-84b0-d20e1c8768b0\") "
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666841 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666863 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.666883 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs" (OuterVolumeSpecName: "logs") pod "8d79556d-66cd-44c2-84b0-d20e1c8768b0" (UID: "8d79556d-66cd-44c2-84b0-d20e1c8768b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.669962 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr" (OuterVolumeSpecName: "kube-api-access-jcldr") pod "8d79556d-66cd-44c2-84b0-d20e1c8768b0" (UID: "8d79556d-66cd-44c2-84b0-d20e1c8768b0"). InnerVolumeSpecName "kube-api-access-jcldr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.689871 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d79556d-66cd-44c2-84b0-d20e1c8768b0" (UID: "8d79556d-66cd-44c2-84b0-d20e1c8768b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.769138 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d79556d-66cd-44c2-84b0-d20e1c8768b0-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.769468 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcldr\" (UniqueName: \"kubernetes.io/projected/8d79556d-66cd-44c2-84b0-d20e1c8768b0-kube-api-access-jcldr\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:44 crc kubenswrapper[5118]: I0223 08:45:44.769545 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d79556d-66cd-44c2-84b0-d20e1c8768b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.012955 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-br4mg" event={"ID":"8d79556d-66cd-44c2-84b0-d20e1c8768b0","Type":"ContainerDied","Data":"7fd2e7a3295b3890780e2439a76f5b326a31e61a3539f23a96a62336a32bd845"}
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.013019 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fd2e7a3295b3890780e2439a76f5b326a31e61a3539f23a96a62336a32bd845"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.013560 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-br4mg"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.227140 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-69b8f76cc6-xp4qh"]
Feb 23 08:45:45 crc kubenswrapper[5118]: E0223 08:45:45.227647 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d79556d-66cd-44c2-84b0-d20e1c8768b0" containerName="placement-db-sync"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.227674 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d79556d-66cd-44c2-84b0-d20e1c8768b0" containerName="placement-db-sync"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.227924 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d79556d-66cd-44c2-84b0-d20e1c8768b0" containerName="placement-db-sync"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.229014 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.234766 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cfl7d"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.235358 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.235597 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.250340 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69b8f76cc6-xp4qh"]
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.380081 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-scripts\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.380149 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-config-data\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.380222 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-combined-ca-bundle\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.380365 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b08825-a737-4e87-b491-8cba50c463dd-logs\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.380467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g656g\" (UniqueName: \"kubernetes.io/projected/c7b08825-a737-4e87-b491-8cba50c463dd-kube-api-access-g656g\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.482539 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-scripts\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.482621 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-config-data\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.482723 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-combined-ca-bundle\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh"
Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.482764 5118 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b08825-a737-4e87-b491-8cba50c463dd-logs\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.482807 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g656g\" (UniqueName: \"kubernetes.io/projected/c7b08825-a737-4e87-b491-8cba50c463dd-kube-api-access-g656g\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.485731 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b08825-a737-4e87-b491-8cba50c463dd-logs\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.489046 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-scripts\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.490706 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-combined-ca-bundle\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.493623 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b08825-a737-4e87-b491-8cba50c463dd-config-data\") pod 
\"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.501666 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g656g\" (UniqueName: \"kubernetes.io/projected/c7b08825-a737-4e87-b491-8cba50c463dd-kube-api-access-g656g\") pod \"placement-69b8f76cc6-xp4qh\" (UID: \"c7b08825-a737-4e87-b491-8cba50c463dd\") " pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:45 crc kubenswrapper[5118]: I0223 08:45:45.595358 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:46 crc kubenswrapper[5118]: I0223 08:45:46.107494 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69b8f76cc6-xp4qh"] Feb 23 08:45:46 crc kubenswrapper[5118]: I0223 08:45:46.880289 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" Feb 23 08:45:46 crc kubenswrapper[5118]: I0223 08:45:46.962302 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"] Feb 23 08:45:46 crc kubenswrapper[5118]: I0223 08:45:46.962669 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="dnsmasq-dns" containerID="cri-o://a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805" gracePeriod=10 Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.038037 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69b8f76cc6-xp4qh" event={"ID":"c7b08825-a737-4e87-b491-8cba50c463dd","Type":"ContainerStarted","Data":"7287069d5b9644863f6cf522ff2f64ec6d2c49eb07978a51bb1772cbedc3332a"} Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.038124 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-69b8f76cc6-xp4qh" event={"ID":"c7b08825-a737-4e87-b491-8cba50c463dd","Type":"ContainerStarted","Data":"b7db77e4cecf8cbc498f2a90db19a73bf80b1a31977f486a4644dcda79be3dfb"} Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.038140 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69b8f76cc6-xp4qh" event={"ID":"c7b08825-a737-4e87-b491-8cba50c463dd","Type":"ContainerStarted","Data":"6ae86b36f3917a659358aab0c3af4e194b6ddbdd2a974a9688ff0f53275f1325"} Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.038666 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.038707 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.058371 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-69b8f76cc6-xp4qh" podStartSLOduration=2.058346561 podStartE2EDuration="2.058346561s" podCreationTimestamp="2026-02-23 08:45:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:45:47.055464082 +0000 UTC m=+7210.059248675" watchObservedRunningTime="2026-02-23 08:45:47.058346561 +0000 UTC m=+7210.062131144" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.467309 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.634302 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb\") pod \"b3afb984-0794-450e-a141-0068ac7fca00\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.634366 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb\") pod \"b3afb984-0794-450e-a141-0068ac7fca00\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.634412 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config\") pod \"b3afb984-0794-450e-a141-0068ac7fca00\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.634482 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhbhb\" (UniqueName: \"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb\") pod \"b3afb984-0794-450e-a141-0068ac7fca00\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.634652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc\") pod \"b3afb984-0794-450e-a141-0068ac7fca00\" (UID: \"b3afb984-0794-450e-a141-0068ac7fca00\") " Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.642553 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb" (OuterVolumeSpecName: "kube-api-access-nhbhb") pod "b3afb984-0794-450e-a141-0068ac7fca00" (UID: "b3afb984-0794-450e-a141-0068ac7fca00"). InnerVolumeSpecName "kube-api-access-nhbhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.700063 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3afb984-0794-450e-a141-0068ac7fca00" (UID: "b3afb984-0794-450e-a141-0068ac7fca00"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.702181 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config" (OuterVolumeSpecName: "config") pod "b3afb984-0794-450e-a141-0068ac7fca00" (UID: "b3afb984-0794-450e-a141-0068ac7fca00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.713042 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3afb984-0794-450e-a141-0068ac7fca00" (UID: "b3afb984-0794-450e-a141-0068ac7fca00"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.719954 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3afb984-0794-450e-a141-0068ac7fca00" (UID: "b3afb984-0794-450e-a141-0068ac7fca00"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.738119 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.739003 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.739124 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.739144 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3afb984-0794-450e-a141-0068ac7fca00-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:47 crc kubenswrapper[5118]: I0223 08:45:47.739159 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhbhb\" (UniqueName: \"kubernetes.io/projected/b3afb984-0794-450e-a141-0068ac7fca00-kube-api-access-nhbhb\") on node \"crc\" DevicePath \"\"" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.057170 5118 generic.go:334] "Generic (PLEG): container finished" podID="b3afb984-0794-450e-a141-0068ac7fca00" containerID="a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805" exitCode=0 Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.057235 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.057268 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" event={"ID":"b3afb984-0794-450e-a141-0068ac7fca00","Type":"ContainerDied","Data":"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805"} Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.057358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dfb985d97-f7qdh" event={"ID":"b3afb984-0794-450e-a141-0068ac7fca00","Type":"ContainerDied","Data":"f60eee71263987cee4f1d0ba9f8a5ae8df7c302cef506c431d928a67874cea05"} Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.057392 5118 scope.go:117] "RemoveContainer" containerID="a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.091408 5118 scope.go:117] "RemoveContainer" containerID="aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.133002 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"] Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.147812 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dfb985d97-f7qdh"] Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.148829 5118 scope.go:117] "RemoveContainer" containerID="a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805" Feb 23 08:45:48 crc kubenswrapper[5118]: E0223 08:45:48.150582 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805\": container with ID starting with a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805 not found: ID does not exist" 
containerID="a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.150624 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805"} err="failed to get container status \"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805\": rpc error: code = NotFound desc = could not find container \"a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805\": container with ID starting with a6d6407e57a91efe5440b911e78ab7244f9927683634fa23666f3f306aa43805 not found: ID does not exist" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.150658 5118 scope.go:117] "RemoveContainer" containerID="aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b" Feb 23 08:45:48 crc kubenswrapper[5118]: E0223 08:45:48.151427 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b\": container with ID starting with aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b not found: ID does not exist" containerID="aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.151467 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b"} err="failed to get container status \"aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b\": rpc error: code = NotFound desc = could not find container \"aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b\": container with ID starting with aa8a075c1d0dfe65457d0e42c7a66dee2c7cc5090a683ed4dbdbfe0e6caacb5b not found: ID does not exist" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.241216 5118 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:45:48 crc kubenswrapper[5118]: E0223 08:45:48.241882 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="dnsmasq-dns" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.241913 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="dnsmasq-dns" Feb 23 08:45:48 crc kubenswrapper[5118]: E0223 08:45:48.241942 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="init" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.241953 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="init" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.242246 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3afb984-0794-450e-a141-0068ac7fca00" containerName="dnsmasq-dns" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.244442 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.251952 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.351266 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5p7r\" (UniqueName: \"kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.351878 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.352025 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.453439 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5p7r\" (UniqueName: \"kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.453522 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.453564 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.454085 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.454500 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.477362 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5p7r\" (UniqueName: \"kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r\") pod \"redhat-operators-mr4rb\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:48 crc kubenswrapper[5118]: I0223 08:45:48.561639 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:49 crc kubenswrapper[5118]: I0223 08:45:49.042945 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:45:49 crc kubenswrapper[5118]: I0223 08:45:49.068981 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerStarted","Data":"85810c8e060312619c03e02193de3b11a5972ddd32b847e39dde299fc5de562c"} Feb 23 08:45:49 crc kubenswrapper[5118]: I0223 08:45:49.708672 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3afb984-0794-450e-a141-0068ac7fca00" path="/var/lib/kubelet/pods/b3afb984-0794-450e-a141-0068ac7fca00/volumes" Feb 23 08:45:50 crc kubenswrapper[5118]: I0223 08:45:50.082264 5118 generic.go:334] "Generic (PLEG): container finished" podID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerID="db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce" exitCode=0 Feb 23 08:45:50 crc kubenswrapper[5118]: I0223 08:45:50.082327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerDied","Data":"db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce"} Feb 23 08:45:51 crc kubenswrapper[5118]: I0223 08:45:51.104334 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerStarted","Data":"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d"} Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.131775 5118 generic.go:334] "Generic (PLEG): container finished" podID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerID="1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d" exitCode=0 Feb 23 08:45:53 crc 
kubenswrapper[5118]: I0223 08:45:53.131867 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerDied","Data":"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d"} Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.634686 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.639085 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.654978 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.769657 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.770235 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kwh\" (UniqueName: \"kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.770267 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities\") pod \"community-operators-86d4l\" (UID: 
\"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.871864 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.871954 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kwh\" (UniqueName: \"kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.871974 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.872491 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.872530 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") 
" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.896879 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kwh\" (UniqueName: \"kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh\") pod \"community-operators-86d4l\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:53 crc kubenswrapper[5118]: I0223 08:45:53.971905 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:45:54 crc kubenswrapper[5118]: I0223 08:45:54.199293 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerStarted","Data":"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00"} Feb 23 08:45:54 crc kubenswrapper[5118]: I0223 08:45:54.575767 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mr4rb" podStartSLOduration=2.844698691 podStartE2EDuration="6.5757465s" podCreationTimestamp="2026-02-23 08:45:48 +0000 UTC" firstStartedPulling="2026-02-23 08:45:50.085924287 +0000 UTC m=+7213.089708860" lastFinishedPulling="2026-02-23 08:45:53.816972096 +0000 UTC m=+7216.820756669" observedRunningTime="2026-02-23 08:45:54.24855162 +0000 UTC m=+7217.252336193" watchObservedRunningTime="2026-02-23 08:45:54.5757465 +0000 UTC m=+7217.579531073" Feb 23 08:45:54 crc kubenswrapper[5118]: I0223 08:45:54.580939 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:45:54 crc kubenswrapper[5118]: W0223 08:45:54.587318 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f2257e_0a07_4d08_a91f_c420cf9536bb.slice/crio-86847ee2be3d21a67b0333204b9460ea5c9489e6cdf923d1b32a4684b97f9df1 WatchSource:0}: Error finding container 86847ee2be3d21a67b0333204b9460ea5c9489e6cdf923d1b32a4684b97f9df1: Status 404 returned error can't find the container with id 86847ee2be3d21a67b0333204b9460ea5c9489e6cdf923d1b32a4684b97f9df1 Feb 23 08:45:55 crc kubenswrapper[5118]: I0223 08:45:55.215651 5118 generic.go:334] "Generic (PLEG): container finished" podID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerID="1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad" exitCode=0 Feb 23 08:45:55 crc kubenswrapper[5118]: I0223 08:45:55.215948 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerDied","Data":"1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad"} Feb 23 08:45:55 crc kubenswrapper[5118]: I0223 08:45:55.216168 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerStarted","Data":"86847ee2be3d21a67b0333204b9460ea5c9489e6cdf923d1b32a4684b97f9df1"} Feb 23 08:45:56 crc kubenswrapper[5118]: I0223 08:45:56.229670 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerStarted","Data":"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae"} Feb 23 08:45:57 crc kubenswrapper[5118]: I0223 08:45:57.139290 5118 scope.go:117] "RemoveContainer" containerID="f4fba51cf8567d8478c874550c14c64c3d870a99fe36a9035a8c0549a98d79ee" Feb 23 08:45:57 crc kubenswrapper[5118]: I0223 08:45:57.243222 5118 generic.go:334] "Generic (PLEG): container finished" podID="37f2257e-0a07-4d08-a91f-c420cf9536bb" 
containerID="5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae" exitCode=0 Feb 23 08:45:57 crc kubenswrapper[5118]: I0223 08:45:57.243443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerDied","Data":"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae"} Feb 23 08:45:58 crc kubenswrapper[5118]: I0223 08:45:58.255773 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerStarted","Data":"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860"} Feb 23 08:45:58 crc kubenswrapper[5118]: I0223 08:45:58.292437 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86d4l" podStartSLOduration=2.866028895 podStartE2EDuration="5.292411952s" podCreationTimestamp="2026-02-23 08:45:53 +0000 UTC" firstStartedPulling="2026-02-23 08:45:55.221263377 +0000 UTC m=+7218.225047980" lastFinishedPulling="2026-02-23 08:45:57.647646464 +0000 UTC m=+7220.651431037" observedRunningTime="2026-02-23 08:45:58.287257718 +0000 UTC m=+7221.291042301" watchObservedRunningTime="2026-02-23 08:45:58.292411952 +0000 UTC m=+7221.296196535" Feb 23 08:45:58 crc kubenswrapper[5118]: I0223 08:45:58.562319 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:58 crc kubenswrapper[5118]: I0223 08:45:58.562686 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:45:59 crc kubenswrapper[5118]: I0223 08:45:59.618917 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mr4rb" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="registry-server" 
probeResult="failure" output=< Feb 23 08:45:59 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 08:45:59 crc kubenswrapper[5118]: > Feb 23 08:46:02 crc kubenswrapper[5118]: I0223 08:46:02.975683 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:46:02 crc kubenswrapper[5118]: I0223 08:46:02.976112 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:46:03 crc kubenswrapper[5118]: I0223 08:46:03.973004 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:03 crc kubenswrapper[5118]: I0223 08:46:03.974220 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:04 crc kubenswrapper[5118]: I0223 08:46:04.038653 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:04 crc kubenswrapper[5118]: I0223 08:46:04.386499 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:04 crc kubenswrapper[5118]: I0223 08:46:04.474906 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.344080 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-86d4l" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="registry-server" containerID="cri-o://4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860" gracePeriod=2 Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.925617 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.964458 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities\") pod \"37f2257e-0a07-4d08-a91f-c420cf9536bb\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.964624 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content\") pod \"37f2257e-0a07-4d08-a91f-c420cf9536bb\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.964810 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8kwh\" (UniqueName: \"kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh\") pod \"37f2257e-0a07-4d08-a91f-c420cf9536bb\" (UID: \"37f2257e-0a07-4d08-a91f-c420cf9536bb\") " Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.966856 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities" (OuterVolumeSpecName: "utilities") pod "37f2257e-0a07-4d08-a91f-c420cf9536bb" (UID: "37f2257e-0a07-4d08-a91f-c420cf9536bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:06 crc kubenswrapper[5118]: I0223 08:46:06.972372 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh" (OuterVolumeSpecName: "kube-api-access-t8kwh") pod "37f2257e-0a07-4d08-a91f-c420cf9536bb" (UID: "37f2257e-0a07-4d08-a91f-c420cf9536bb"). InnerVolumeSpecName "kube-api-access-t8kwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.039505 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37f2257e-0a07-4d08-a91f-c420cf9536bb" (UID: "37f2257e-0a07-4d08-a91f-c420cf9536bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.067509 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.067549 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8kwh\" (UniqueName: \"kubernetes.io/projected/37f2257e-0a07-4d08-a91f-c420cf9536bb-kube-api-access-t8kwh\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.067565 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37f2257e-0a07-4d08-a91f-c420cf9536bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.362896 5118 generic.go:334] "Generic (PLEG): container finished" podID="37f2257e-0a07-4d08-a91f-c420cf9536bb" 
containerID="4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860" exitCode=0 Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.362942 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerDied","Data":"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860"} Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.362972 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86d4l" event={"ID":"37f2257e-0a07-4d08-a91f-c420cf9536bb","Type":"ContainerDied","Data":"86847ee2be3d21a67b0333204b9460ea5c9489e6cdf923d1b32a4684b97f9df1"} Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.362993 5118 scope.go:117] "RemoveContainer" containerID="4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.363032 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-86d4l" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.418579 5118 scope.go:117] "RemoveContainer" containerID="5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.420425 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.435177 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86d4l"] Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.450781 5118 scope.go:117] "RemoveContainer" containerID="1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.526867 5118 scope.go:117] "RemoveContainer" containerID="4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860" Feb 23 08:46:07 crc kubenswrapper[5118]: E0223 08:46:07.527589 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860\": container with ID starting with 4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860 not found: ID does not exist" containerID="4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.527643 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860"} err="failed to get container status \"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860\": rpc error: code = NotFound desc = could not find container \"4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860\": container with ID starting with 4ae53f88ff6139280ceea646eb1a0c142c94b3c1b66ee208cf309a575551a860 not 
found: ID does not exist" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.527680 5118 scope.go:117] "RemoveContainer" containerID="5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae" Feb 23 08:46:07 crc kubenswrapper[5118]: E0223 08:46:07.528328 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae\": container with ID starting with 5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae not found: ID does not exist" containerID="5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.528389 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae"} err="failed to get container status \"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae\": rpc error: code = NotFound desc = could not find container \"5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae\": container with ID starting with 5a2cd9fe1d29791146af5e91e1f188c568608b9b21b0ef89e1f025420bfb4cae not found: ID does not exist" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.528426 5118 scope.go:117] "RemoveContainer" containerID="1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad" Feb 23 08:46:07 crc kubenswrapper[5118]: E0223 08:46:07.528776 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad\": container with ID starting with 1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad not found: ID does not exist" containerID="1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.528816 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad"} err="failed to get container status \"1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad\": rpc error: code = NotFound desc = could not find container \"1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad\": container with ID starting with 1632c3432e6b9a097866a8c4a86101f32278feb2f2e448cc9cb7098309a561ad not found: ID does not exist" Feb 23 08:46:07 crc kubenswrapper[5118]: I0223 08:46:07.709622 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" path="/var/lib/kubelet/pods/37f2257e-0a07-4d08-a91f-c420cf9536bb/volumes" Feb 23 08:46:08 crc kubenswrapper[5118]: I0223 08:46:08.620604 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:46:08 crc kubenswrapper[5118]: I0223 08:46:08.672084 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:46:09 crc kubenswrapper[5118]: I0223 08:46:09.695139 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.395189 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mr4rb" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="registry-server" containerID="cri-o://93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00" gracePeriod=2 Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.858025 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.964754 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5p7r\" (UniqueName: \"kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r\") pod \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.964997 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities\") pod \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.965032 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content\") pod \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\" (UID: \"0b9abc2c-f99c-48b0-a468-214bfdf2df22\") " Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.965926 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities" (OuterVolumeSpecName: "utilities") pod "0b9abc2c-f99c-48b0-a468-214bfdf2df22" (UID: "0b9abc2c-f99c-48b0-a468-214bfdf2df22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.966931 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:10 crc kubenswrapper[5118]: I0223 08:46:10.973847 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r" (OuterVolumeSpecName: "kube-api-access-t5p7r") pod "0b9abc2c-f99c-48b0-a468-214bfdf2df22" (UID: "0b9abc2c-f99c-48b0-a468-214bfdf2df22"). InnerVolumeSpecName "kube-api-access-t5p7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.071996 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5p7r\" (UniqueName: \"kubernetes.io/projected/0b9abc2c-f99c-48b0-a468-214bfdf2df22-kube-api-access-t5p7r\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.080417 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b9abc2c-f99c-48b0-a468-214bfdf2df22" (UID: "0b9abc2c-f99c-48b0-a468-214bfdf2df22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.174343 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9abc2c-f99c-48b0-a468-214bfdf2df22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.419469 5118 generic.go:334] "Generic (PLEG): container finished" podID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerID="93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00" exitCode=0 Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.420031 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerDied","Data":"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00"} Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.420266 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr4rb" event={"ID":"0b9abc2c-f99c-48b0-a468-214bfdf2df22","Type":"ContainerDied","Data":"85810c8e060312619c03e02193de3b11a5972ddd32b847e39dde299fc5de562c"} Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.420413 5118 scope.go:117] "RemoveContainer" containerID="93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.420796 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mr4rb" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.449773 5118 scope.go:117] "RemoveContainer" containerID="1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.475439 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.478158 5118 scope.go:117] "RemoveContainer" containerID="db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.490120 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mr4rb"] Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.516217 5118 scope.go:117] "RemoveContainer" containerID="93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00" Feb 23 08:46:11 crc kubenswrapper[5118]: E0223 08:46:11.517060 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00\": container with ID starting with 93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00 not found: ID does not exist" containerID="93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.517150 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00"} err="failed to get container status \"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00\": rpc error: code = NotFound desc = could not find container \"93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00\": container with ID starting with 93435b051580cd606b82c1a5292bd6cf402fe90f1947a51b44f5f23c63af1e00 not found: ID does 
not exist" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.517187 5118 scope.go:117] "RemoveContainer" containerID="1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d" Feb 23 08:46:11 crc kubenswrapper[5118]: E0223 08:46:11.517766 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d\": container with ID starting with 1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d not found: ID does not exist" containerID="1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.517829 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d"} err="failed to get container status \"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d\": rpc error: code = NotFound desc = could not find container \"1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d\": container with ID starting with 1b8b19143635fd41bf52e24946cce1c95ebc4758576be2bd926232c3709a1e3d not found: ID does not exist" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.517873 5118 scope.go:117] "RemoveContainer" containerID="db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce" Feb 23 08:46:11 crc kubenswrapper[5118]: E0223 08:46:11.518364 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce\": container with ID starting with db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce not found: ID does not exist" containerID="db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.518401 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce"} err="failed to get container status \"db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce\": rpc error: code = NotFound desc = could not find container \"db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce\": container with ID starting with db23c81756138afd5083975387e4dcbb5ea257cf954725f4e11b1ad30253d0ce not found: ID does not exist" Feb 23 08:46:11 crc kubenswrapper[5118]: I0223 08:46:11.710850 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" path="/var/lib/kubelet/pods/0b9abc2c-f99c-48b0-a468-214bfdf2df22/volumes" Feb 23 08:46:16 crc kubenswrapper[5118]: I0223 08:46:16.657182 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:46:16 crc kubenswrapper[5118]: I0223 08:46:16.706182 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69b8f76cc6-xp4qh" Feb 23 08:46:32 crc kubenswrapper[5118]: I0223 08:46:32.975795 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:46:32 crc kubenswrapper[5118]: I0223 08:46:32.976949 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.577000 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-db-create-7m6lm"] Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578825 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="registry-server" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578851 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="registry-server" Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578867 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="extract-utilities" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578875 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="extract-utilities" Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578891 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="extract-utilities" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578898 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="extract-utilities" Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578932 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="extract-content" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578940 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="extract-content" Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578954 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="registry-server" Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578962 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="registry-server"
Feb 23 08:46:41 crc kubenswrapper[5118]: E0223 08:46:41.578973 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="extract-content"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.578980 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="extract-content"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.579258 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9abc2c-f99c-48b0-a468-214bfdf2df22" containerName="registry-server"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.579303 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f2257e-0a07-4d08-a91f-c420cf9536bb" containerName="registry-server"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.580081 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.586055 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7m6lm"]
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.677983 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gj7x7"]
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.679634 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.692763 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-41e6-account-create-update-4pfpl"]
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.716660 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gj7x7"]
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.716862 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.722748 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.727612 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.727917 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxhf\" (UniqueName: \"kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831402 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxhf\" (UniqueName: \"kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831460 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9fvm\" (UniqueName: \"kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831531 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831555 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbgv\" (UniqueName: \"kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831586 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.831605 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.832634 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.837335 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-41e6-account-create-update-4pfpl"]
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.933867 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbgv\" (UniqueName: \"kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.933982 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.934017 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.934260 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9fvm\" (UniqueName: \"kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.935926 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.936479 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.953164 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxhf\" (UniqueName: \"kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf\") pod \"nova-api-db-create-7m6lm\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") " pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:41 crc kubenswrapper[5118]: I0223 08:46:41.967957 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbgv\" (UniqueName: \"kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv\") pod \"nova-api-41e6-account-create-update-4pfpl\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.002751 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9fvm\" (UniqueName: \"kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm\") pod \"nova-cell0-db-create-gj7x7\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") " pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.014551 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.015014 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kj9s6"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.033478 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kj9s6"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.033637 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.037414 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-f7e0-account-create-update-z7f5c"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.039499 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.045548 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.045802 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f7e0-account-create-update-z7f5c"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.049263 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.104225 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c2f9-account-create-update-rrh2l"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.105762 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.111645 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.114696 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c2f9-account-create-update-rrh2l"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146606 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s552\" (UniqueName: \"kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146748 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42c5h\" (UniqueName: \"kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146814 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmcwz\" (UniqueName: \"kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146860 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.146894 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.225859 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248769 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248825 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42c5h\" (UniqueName: \"kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248877 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmcwz\" (UniqueName: \"kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248908 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248930 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.248996 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s552\" (UniqueName: \"kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.250226 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.250663 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.251173 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.272813 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s552\" (UniqueName: \"kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552\") pod \"nova-cell1-db-create-kj9s6\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") " pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.274241 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmcwz\" (UniqueName: \"kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz\") pod \"nova-cell0-f7e0-account-create-update-z7f5c\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") " pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.281011 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42c5h\" (UniqueName: \"kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h\") pod \"nova-cell1-c2f9-account-create-update-rrh2l\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.392763 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.499078 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.505642 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.617172 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gj7x7"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.743713 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-41e6-account-create-update-4pfpl"]
Feb 23 08:46:42 crc kubenswrapper[5118]: W0223 08:46:42.750866 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65965047_113d_44a2_87f9_925f161e5c0c.slice/crio-aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430 WatchSource:0}: Error finding container aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430: Status 404 returned error can't find the container with id aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.753132 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kj9s6"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.772680 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7m6lm"]
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.857831 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m6lm" event={"ID":"060732fd-ca60-43ca-9760-18366ff75e31","Type":"ContainerStarted","Data":"37780bffa692a24c8048ffe2c97c106c4b4aca52cfbcf2cce863d1204581240c"}
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.863699 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kj9s6" event={"ID":"65965047-113d-44a2-87f9-925f161e5c0c","Type":"ContainerStarted","Data":"aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430"}
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.866918 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-41e6-account-create-update-4pfpl" event={"ID":"84d986e2-7735-490b-b3e5-4446205f094e","Type":"ContainerStarted","Data":"92ea066c3570e88566c827de869e041f3edf8ab431ea5d9b0cf4d0016a2fb7d7"}
Feb 23 08:46:42 crc kubenswrapper[5118]: I0223 08:46:42.868704 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gj7x7" event={"ID":"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1","Type":"ContainerStarted","Data":"ff53feb8145aa92df6bca72bc6081a2a7c7b675210bea038314dbca2b7c60f56"}
Feb 23 08:46:43 crc kubenswrapper[5118]: W0223 08:46:43.058972 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39e8379_74ca_4713_8494_81de13ec2734.slice/crio-f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb WatchSource:0}: Error finding container f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb: Status 404 returned error can't find the container with id f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.062133 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-f7e0-account-create-update-z7f5c"]
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.137851 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c2f9-account-create-update-rrh2l"]
Feb 23 08:46:43 crc kubenswrapper[5118]: W0223 08:46:43.150369 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5a7092c_8bd3_462b_8ce9_8dd40cc35067.slice/crio-4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd WatchSource:0}: Error finding container 4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd: Status 404 returned error can't find the container with id 4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.879556 5118 generic.go:334] "Generic (PLEG): container finished" podID="99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" containerID="2da11e32d4e311d7141f7671012e3579e2c7a1c554c749bc940b3fdc4d67c6b5" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.879655 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gj7x7" event={"ID":"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1","Type":"ContainerDied","Data":"2da11e32d4e311d7141f7671012e3579e2c7a1c554c749bc940b3fdc4d67c6b5"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.882859 5118 generic.go:334] "Generic (PLEG): container finished" podID="060732fd-ca60-43ca-9760-18366ff75e31" containerID="1654a3c8278af0612f23b34c40534afbac9ab069b0e5238f98524ae4c22a5f98" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.882924 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m6lm" event={"ID":"060732fd-ca60-43ca-9760-18366ff75e31","Type":"ContainerDied","Data":"1654a3c8278af0612f23b34c40534afbac9ab069b0e5238f98524ae4c22a5f98"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.885039 5118 generic.go:334] "Generic (PLEG): container finished" podID="a5a7092c-8bd3-462b-8ce9-8dd40cc35067" containerID="26fc64a0bfae4863969afe929650aa6eac6e469c0339accb15675b4c355ed0b4" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.885083 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l" event={"ID":"a5a7092c-8bd3-462b-8ce9-8dd40cc35067","Type":"ContainerDied","Data":"26fc64a0bfae4863969afe929650aa6eac6e469c0339accb15675b4c355ed0b4"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.885134 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l" event={"ID":"a5a7092c-8bd3-462b-8ce9-8dd40cc35067","Type":"ContainerStarted","Data":"4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.887475 5118 generic.go:334] "Generic (PLEG): container finished" podID="65965047-113d-44a2-87f9-925f161e5c0c" containerID="ebb7ab81d7ff1968e30477440f577825c218f87601d3aa92e1d7ee194427fba0" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.887563 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kj9s6" event={"ID":"65965047-113d-44a2-87f9-925f161e5c0c","Type":"ContainerDied","Data":"ebb7ab81d7ff1968e30477440f577825c218f87601d3aa92e1d7ee194427fba0"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.890498 5118 generic.go:334] "Generic (PLEG): container finished" podID="c39e8379-74ca-4713-8494-81de13ec2734" containerID="8f21f817072d9e4dad2a061e5e39b5779c18192efd1982a628161d3a1bbddb29" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.890670 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c" event={"ID":"c39e8379-74ca-4713-8494-81de13ec2734","Type":"ContainerDied","Data":"8f21f817072d9e4dad2a061e5e39b5779c18192efd1982a628161d3a1bbddb29"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.890700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c" event={"ID":"c39e8379-74ca-4713-8494-81de13ec2734","Type":"ContainerStarted","Data":"f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb"}
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.896706 5118 generic.go:334] "Generic (PLEG): container finished" podID="84d986e2-7735-490b-b3e5-4446205f094e" containerID="e2841cd7138b6367db8df98e2cd4100108182c6f0a2dbbf5a0bf8aaf87fa2bfe" exitCode=0
Feb 23 08:46:43 crc kubenswrapper[5118]: I0223 08:46:43.896771 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-41e6-account-create-update-4pfpl" event={"ID":"84d986e2-7735-490b-b3e5-4446205f094e","Type":"ContainerDied","Data":"e2841cd7138b6367db8df98e2cd4100108182c6f0a2dbbf5a0bf8aaf87fa2bfe"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.344732 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.427422 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts\") pod \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") "
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.427612 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9fvm\" (UniqueName: \"kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm\") pod \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\" (UID: \"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1\") "
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.428296 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" (UID: "99fca003-f1bf-42ec-9bac-8c1bb4cc37c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.434048 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm" (OuterVolumeSpecName: "kube-api-access-j9fvm") pod "99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" (UID: "99fca003-f1bf-42ec-9bac-8c1bb4cc37c1"). InnerVolumeSpecName "kube-api-access-j9fvm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.529619 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9fvm\" (UniqueName: \"kubernetes.io/projected/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-kube-api-access-j9fvm\") on node \"crc\" DevicePath \"\""
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.529898 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.931745 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-41e6-account-create-update-4pfpl" event={"ID":"84d986e2-7735-490b-b3e5-4446205f094e","Type":"ContainerDied","Data":"92ea066c3570e88566c827de869e041f3edf8ab431ea5d9b0cf4d0016a2fb7d7"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.931797 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ea066c3570e88566c827de869e041f3edf8ab431ea5d9b0cf4d0016a2fb7d7"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.934229 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gj7x7" event={"ID":"99fca003-f1bf-42ec-9bac-8c1bb4cc37c1","Type":"ContainerDied","Data":"ff53feb8145aa92df6bca72bc6081a2a7c7b675210bea038314dbca2b7c60f56"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.934256 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff53feb8145aa92df6bca72bc6081a2a7c7b675210bea038314dbca2b7c60f56"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.934316 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gj7x7"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.938566 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7m6lm" event={"ID":"060732fd-ca60-43ca-9760-18366ff75e31","Type":"ContainerDied","Data":"37780bffa692a24c8048ffe2c97c106c4b4aca52cfbcf2cce863d1204581240c"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.938592 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37780bffa692a24c8048ffe2c97c106c4b4aca52cfbcf2cce863d1204581240c"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.939809 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kj9s6" event={"ID":"65965047-113d-44a2-87f9-925f161e5c0c","Type":"ContainerDied","Data":"aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.939834 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aea413e036ca99f188cd2560856ba697651582834cb8e8703765002ecebf4430"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.942000 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c" event={"ID":"c39e8379-74ca-4713-8494-81de13ec2734","Type":"ContainerDied","Data":"f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb"}
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.942024 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25a548df7706675b98601b02796b6189aff6f9a845e78cfc85cf5003adebecb"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.989378 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7m6lm"
Feb 23 08:46:45 crc kubenswrapper[5118]: I0223 08:46:45.996747 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kj9s6"
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.013755 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-41e6-account-create-update-4pfpl"
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.019009 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c"
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.029907 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l"
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.048271 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts\") pod \"65965047-113d-44a2-87f9-925f161e5c0c\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.048342 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s552\" (UniqueName: \"kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552\") pod \"65965047-113d-44a2-87f9-925f161e5c0c\" (UID: \"65965047-113d-44a2-87f9-925f161e5c0c\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.048431 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxhf\" (UniqueName: \"kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf\") pod \"060732fd-ca60-43ca-9760-18366ff75e31\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.048548 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts\") pod \"060732fd-ca60-43ca-9760-18366ff75e31\" (UID: \"060732fd-ca60-43ca-9760-18366ff75e31\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.048842 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65965047-113d-44a2-87f9-925f161e5c0c" (UID: "65965047-113d-44a2-87f9-925f161e5c0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.049171 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65965047-113d-44a2-87f9-925f161e5c0c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.049701 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "060732fd-ca60-43ca-9760-18366ff75e31" (UID: "060732fd-ca60-43ca-9760-18366ff75e31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.056756 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552" (OuterVolumeSpecName: "kube-api-access-9s552") pod "65965047-113d-44a2-87f9-925f161e5c0c" (UID: "65965047-113d-44a2-87f9-925f161e5c0c"). InnerVolumeSpecName "kube-api-access-9s552". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.060510 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf" (OuterVolumeSpecName: "kube-api-access-dgxhf") pod "060732fd-ca60-43ca-9760-18366ff75e31" (UID: "060732fd-ca60-43ca-9760-18366ff75e31"). InnerVolumeSpecName "kube-api-access-dgxhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.150249 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts\") pod \"c39e8379-74ca-4713-8494-81de13ec2734\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.150396 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdbgv\" (UniqueName: \"kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv\") pod \"84d986e2-7735-490b-b3e5-4446205f094e\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.150500 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42c5h\" (UniqueName: \"kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h\") pod \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") "
Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.150580 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmcwz\" (UniqueName: \"kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz\") pod \"c39e8379-74ca-4713-8494-81de13ec2734\" (UID: \"c39e8379-74ca-4713-8494-81de13ec2734\") "
Feb 23 08:46:46
crc kubenswrapper[5118]: I0223 08:46:46.150629 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts\") pod \"84d986e2-7735-490b-b3e5-4446205f094e\" (UID: \"84d986e2-7735-490b-b3e5-4446205f094e\") " Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.150669 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts\") pod \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\" (UID: \"a5a7092c-8bd3-462b-8ce9-8dd40cc35067\") " Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151135 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s552\" (UniqueName: \"kubernetes.io/projected/65965047-113d-44a2-87f9-925f161e5c0c-kube-api-access-9s552\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151154 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxhf\" (UniqueName: \"kubernetes.io/projected/060732fd-ca60-43ca-9760-18366ff75e31-kube-api-access-dgxhf\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151167 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060732fd-ca60-43ca-9760-18366ff75e31-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151496 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5a7092c-8bd3-462b-8ce9-8dd40cc35067" (UID: "a5a7092c-8bd3-462b-8ce9-8dd40cc35067"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151844 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84d986e2-7735-490b-b3e5-4446205f094e" (UID: "84d986e2-7735-490b-b3e5-4446205f094e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.151868 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c39e8379-74ca-4713-8494-81de13ec2734" (UID: "c39e8379-74ca-4713-8494-81de13ec2734"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.153585 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz" (OuterVolumeSpecName: "kube-api-access-wmcwz") pod "c39e8379-74ca-4713-8494-81de13ec2734" (UID: "c39e8379-74ca-4713-8494-81de13ec2734"). InnerVolumeSpecName "kube-api-access-wmcwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.155678 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv" (OuterVolumeSpecName: "kube-api-access-hdbgv") pod "84d986e2-7735-490b-b3e5-4446205f094e" (UID: "84d986e2-7735-490b-b3e5-4446205f094e"). InnerVolumeSpecName "kube-api-access-hdbgv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.156646 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h" (OuterVolumeSpecName: "kube-api-access-42c5h") pod "a5a7092c-8bd3-462b-8ce9-8dd40cc35067" (UID: "a5a7092c-8bd3-462b-8ce9-8dd40cc35067"). InnerVolumeSpecName "kube-api-access-42c5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.252999 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmcwz\" (UniqueName: \"kubernetes.io/projected/c39e8379-74ca-4713-8494-81de13ec2734-kube-api-access-wmcwz\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.253036 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84d986e2-7735-490b-b3e5-4446205f094e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.253046 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.253056 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c39e8379-74ca-4713-8494-81de13ec2734-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.253066 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdbgv\" (UniqueName: \"kubernetes.io/projected/84d986e2-7735-490b-b3e5-4446205f094e-kube-api-access-hdbgv\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.253075 5118 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-42c5h\" (UniqueName: \"kubernetes.io/projected/a5a7092c-8bd3-462b-8ce9-8dd40cc35067-kube-api-access-42c5h\") on node \"crc\" DevicePath \"\"" Feb 23 08:46:46 crc kubenswrapper[5118]: I0223 08:46:46.999882 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kj9s6" Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002292 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l" event={"ID":"a5a7092c-8bd3-462b-8ce9-8dd40cc35067","Type":"ContainerDied","Data":"4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd"} Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002345 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-41e6-account-create-update-4pfpl" Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002374 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4425bbf4107a87ab252a97780494e587d1d5b6ca641170f38d24f0bfb5059ebd" Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002403 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c2f9-account-create-update-rrh2l" Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002416 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-f7e0-account-create-update-z7f5c" Feb 23 08:46:47 crc kubenswrapper[5118]: I0223 08:46:47.002561 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7m6lm" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.089972 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cvjz"] Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090830 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5a7092c-8bd3-462b-8ce9-8dd40cc35067" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090849 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a7092c-8bd3-462b-8ce9-8dd40cc35067" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090872 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d986e2-7735-490b-b3e5-4446205f094e" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090880 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d986e2-7735-490b-b3e5-4446205f094e" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090896 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65965047-113d-44a2-87f9-925f161e5c0c" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090904 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="65965047-113d-44a2-87f9-925f161e5c0c" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090918 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39e8379-74ca-4713-8494-81de13ec2734" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090925 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39e8379-74ca-4713-8494-81de13ec2734" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090945 5118 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060732fd-ca60-43ca-9760-18366ff75e31" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090952 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="060732fd-ca60-43ca-9760-18366ff75e31" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: E0223 08:46:52.090967 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.090973 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091176 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5a7092c-8bd3-462b-8ce9-8dd40cc35067" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091192 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d986e2-7735-490b-b3e5-4446205f094e" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091203 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="65965047-113d-44a2-87f9-925f161e5c0c" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091216 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39e8379-74ca-4713-8494-81de13ec2734" containerName="mariadb-account-create-update" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091232 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.091244 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="060732fd-ca60-43ca-9760-18366ff75e31" containerName="mariadb-database-create" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.092001 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.099518 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.101921 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rn7rn" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.102299 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.132198 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cvjz"] Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.182883 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.183003 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj62k\" (UniqueName: \"kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.183069 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.183171 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.285191 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.285278 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.285352 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj62k\" (UniqueName: \"kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.285405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.291809 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.292048 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.296350 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.306736 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj62k\" (UniqueName: \"kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k\") pod \"nova-cell0-conductor-db-sync-6cvjz\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.424648 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:46:52 crc kubenswrapper[5118]: I0223 08:46:52.722772 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cvjz"] Feb 23 08:46:53 crc kubenswrapper[5118]: I0223 08:46:53.067538 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" event={"ID":"3de48519-b3d7-4123-8138-1151dbd0f25e","Type":"ContainerStarted","Data":"97257c29052e6a0ef71a36d6a23283ab80e056b222867c519e35e277ba1c4507"} Feb 23 08:46:57 crc kubenswrapper[5118]: I0223 08:46:57.331583 5118 scope.go:117] "RemoveContainer" containerID="fe7be9c4b8a68ea5367a7935c0c4cd49788b11cc6f1fa945035c6d7f19706c89" Feb 23 08:47:02 crc kubenswrapper[5118]: I0223 08:47:02.975523 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:47:02 crc kubenswrapper[5118]: I0223 08:47:02.977806 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:47:02 crc kubenswrapper[5118]: I0223 08:47:02.978642 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:47:02 crc kubenswrapper[5118]: I0223 08:47:02.979784 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb"} 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:47:02 crc kubenswrapper[5118]: I0223 08:47:02.980079 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb" gracePeriod=600 Feb 23 08:47:03 crc kubenswrapper[5118]: I0223 08:47:03.176501 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" event={"ID":"3de48519-b3d7-4123-8138-1151dbd0f25e","Type":"ContainerStarted","Data":"f88048c0d6a3e1ce4197293513987f53aaec3b8f66b4d42a3481a6306bb060da"} Feb 23 08:47:03 crc kubenswrapper[5118]: I0223 08:47:03.191470 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb" exitCode=0 Feb 23 08:47:03 crc kubenswrapper[5118]: I0223 08:47:03.191535 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb"} Feb 23 08:47:03 crc kubenswrapper[5118]: I0223 08:47:03.191583 5118 scope.go:117] "RemoveContainer" containerID="058311a1c57ee0fa4eac6b6098a6845929728d5af7e0924bb8433c0265beaa92" Feb 23 08:47:03 crc kubenswrapper[5118]: I0223 08:47:03.212619 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" podStartSLOduration=1.63017975 podStartE2EDuration="11.212590163s" podCreationTimestamp="2026-02-23 08:46:52 +0000 UTC" firstStartedPulling="2026-02-23 
08:46:52.732221954 +0000 UTC m=+7275.736006527" lastFinishedPulling="2026-02-23 08:47:02.314632377 +0000 UTC m=+7285.318416940" observedRunningTime="2026-02-23 08:47:03.198077414 +0000 UTC m=+7286.201861987" watchObservedRunningTime="2026-02-23 08:47:03.212590163 +0000 UTC m=+7286.216374736" Feb 23 08:47:04 crc kubenswrapper[5118]: I0223 08:47:04.244336 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"} Feb 23 08:47:08 crc kubenswrapper[5118]: I0223 08:47:08.288560 5118 generic.go:334] "Generic (PLEG): container finished" podID="3de48519-b3d7-4123-8138-1151dbd0f25e" containerID="f88048c0d6a3e1ce4197293513987f53aaec3b8f66b4d42a3481a6306bb060da" exitCode=0 Feb 23 08:47:08 crc kubenswrapper[5118]: I0223 08:47:08.288658 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" event={"ID":"3de48519-b3d7-4123-8138-1151dbd0f25e","Type":"ContainerDied","Data":"f88048c0d6a3e1ce4197293513987f53aaec3b8f66b4d42a3481a6306bb060da"} Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.672986 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.782611 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle\") pod \"3de48519-b3d7-4123-8138-1151dbd0f25e\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.783741 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data\") pod \"3de48519-b3d7-4123-8138-1151dbd0f25e\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.784002 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj62k\" (UniqueName: \"kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k\") pod \"3de48519-b3d7-4123-8138-1151dbd0f25e\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.784139 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts\") pod \"3de48519-b3d7-4123-8138-1151dbd0f25e\" (UID: \"3de48519-b3d7-4123-8138-1151dbd0f25e\") " Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.798446 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts" (OuterVolumeSpecName: "scripts") pod "3de48519-b3d7-4123-8138-1151dbd0f25e" (UID: "3de48519-b3d7-4123-8138-1151dbd0f25e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.807357 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k" (OuterVolumeSpecName: "kube-api-access-qj62k") pod "3de48519-b3d7-4123-8138-1151dbd0f25e" (UID: "3de48519-b3d7-4123-8138-1151dbd0f25e"). InnerVolumeSpecName "kube-api-access-qj62k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.857429 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3de48519-b3d7-4123-8138-1151dbd0f25e" (UID: "3de48519-b3d7-4123-8138-1151dbd0f25e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.889571 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.889695 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj62k\" (UniqueName: \"kubernetes.io/projected/3de48519-b3d7-4123-8138-1151dbd0f25e-kube-api-access-qj62k\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.889768 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.917061 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data" 
(OuterVolumeSpecName: "config-data") pod "3de48519-b3d7-4123-8138-1151dbd0f25e" (UID: "3de48519-b3d7-4123-8138-1151dbd0f25e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:09 crc kubenswrapper[5118]: I0223 08:47:09.991630 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de48519-b3d7-4123-8138-1151dbd0f25e-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.314083 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6cvjz" event={"ID":"3de48519-b3d7-4123-8138-1151dbd0f25e","Type":"ContainerDied","Data":"97257c29052e6a0ef71a36d6a23283ab80e056b222867c519e35e277ba1c4507"}
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.314204 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97257c29052e6a0ef71a36d6a23283ab80e056b222867c519e35e277ba1c4507"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.314329 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6cvjz"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.503366 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:47:10 crc kubenswrapper[5118]: E0223 08:47:10.503838 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de48519-b3d7-4123-8138-1151dbd0f25e" containerName="nova-cell0-conductor-db-sync"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.503861 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de48519-b3d7-4123-8138-1151dbd0f25e" containerName="nova-cell0-conductor-db-sync"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.504129 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de48519-b3d7-4123-8138-1151dbd0f25e" containerName="nova-cell0-conductor-db-sync"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.504859 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.509000 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.509597 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rn7rn"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.534821 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.606083 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.606312 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.606363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8xdf\" (UniqueName: \"kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.707750 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.707883 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.707937 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8xdf\" (UniqueName: \"kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.714750 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.715373 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.740235 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8xdf\" (UniqueName: \"kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf\") pod \"nova-cell0-conductor-0\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:10 crc kubenswrapper[5118]: I0223 08:47:10.864399 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:11 crc kubenswrapper[5118]: I0223 08:47:11.419071 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:47:12 crc kubenswrapper[5118]: I0223 08:47:12.348328 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00136a25-626f-4abc-9f43-776ce7aa84bd","Type":"ContainerStarted","Data":"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"}
Feb 23 08:47:12 crc kubenswrapper[5118]: I0223 08:47:12.348717 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00136a25-626f-4abc-9f43-776ce7aa84bd","Type":"ContainerStarted","Data":"859e7b7a673f0cd6d9df80fa1db16e5a1b4651b64dfe9e5705cef62ada003511"}
Feb 23 08:47:12 crc kubenswrapper[5118]: I0223 08:47:12.348747 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:12 crc kubenswrapper[5118]: I0223 08:47:12.374208 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.374184011 podStartE2EDuration="2.374184011s" podCreationTimestamp="2026-02-23 08:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:12.371372293 +0000 UTC m=+7295.375156906" watchObservedRunningTime="2026-02-23 08:47:12.374184011 +0000 UTC m=+7295.377968584"
Feb 23 08:47:20 crc kubenswrapper[5118]: I0223 08:47:20.920152 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.418155 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pn8cv"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.419347 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.423775 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.423973 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.429748 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pn8cv"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.519453 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.520112 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.520153 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.520176 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hr9f\" (UniqueName: \"kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.585228 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.586758 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.590406 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.604823 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.622260 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.622311 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.622332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hr9f\" (UniqueName: \"kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.622398 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.635243 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.644137 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.644283 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.647685 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hr9f\" (UniqueName: \"kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f\") pod \"nova-cell0-cell-mapping-pn8cv\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.671896 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.673491 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.697720 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.719638 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.725658 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.725712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7jt\" (UniqueName: \"kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.725762 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.768468 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pn8cv"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.799061 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828082 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828186 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828218 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xc4\" (UniqueName: \"kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828249 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7jt\" (UniqueName: \"kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828309 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828362 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.828400 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.844533 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.844683 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.884189 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.884904 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.909023 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.924822 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7jt\" (UniqueName: \"kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt\") pod \"nova-scheduler-0\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.933319 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"]
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.938751 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xc4\" (UniqueName: \"kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.938984 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2vzg\" (UniqueName: \"kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.939077 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.939185 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.939713 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.939818 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.939902 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.940105 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.941193 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.948361 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.949825 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.955389 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:21 crc kubenswrapper[5118]: I0223 08:47:21.992906 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xc4\" (UniqueName: \"kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4\") pod \"nova-metadata-0\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.001834 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.017938 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.019668 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.028468 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.030043 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.034707 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042208 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042282 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042321 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042368 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042401 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxf6r\" (UniqueName: \"kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042442 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042470 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042511 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2vzg\" (UniqueName: \"kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042538 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042559 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042595 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.042628 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmvtf\" (UniqueName: \"kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.043220 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.052209 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.054837 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.067661 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2vzg\" (UniqueName: \"kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg\") pod \"nova-api-0\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.146824 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147290 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmvtf\" (UniqueName: \"kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147367 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147407 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147431 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxf6r\" (UniqueName: \"kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.147488 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.148297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.151369 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.151498 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.152147 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.153367 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.157864 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.168968 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmvtf\" (UniqueName: \"kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf\") pod \"dnsmasq-dns-d9dbb6f47-x95pd\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.199815 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxf6r\" (UniqueName: \"kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r\") pod \"nova-cell1-novncproxy-0\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.213273 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.273047 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.299958 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.373988 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.489866 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pn8cv"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.545211 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-twmzg"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.550138 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-twmzg"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.555621 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.555821 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.621174 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-twmzg"]
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.667947 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.670526 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nggz\" (UniqueName: \"kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg"
Feb 23 08:47:22 crc kubenswrapper[5118]: I0223
08:47:22.671000 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.671158 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.721704 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.776385 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.776447 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.776511 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts\") pod 
\"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.776541 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nggz\" (UniqueName: \"kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.793734 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.794805 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.795782 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.822868 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nggz\" (UniqueName: \"kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz\") pod 
\"nova-cell1-conductor-db-sync-twmzg\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.893367 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:22 crc kubenswrapper[5118]: I0223 08:47:22.902589 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.101612 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:23 crc kubenswrapper[5118]: W0223 08:47:23.110764 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc95d343_d5d6_42f8_86a1_ab57138763cd.slice/crio-dd67a4c2c753606af0d0a1f19ee14fb2af2d94066814940c3a6ea860904bfb5b WatchSource:0}: Error finding container dd67a4c2c753606af0d0a1f19ee14fb2af2d94066814940c3a6ea860904bfb5b: Status 404 returned error can't find the container with id dd67a4c2c753606af0d0a1f19ee14fb2af2d94066814940c3a6ea860904bfb5b Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.141304 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"] Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.236916 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.423461 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-twmzg"] Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.511684 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerStarted","Data":"783282d1e64aaa81cbb55424a26ba173bd1b6829d49fa9d9af79dde5d5bd6a24"} Feb 23 08:47:23 crc 
kubenswrapper[5118]: I0223 08:47:23.515056 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-twmzg" event={"ID":"10c16330-f3b7-4cad-ad50-34a5e5f79b9b","Type":"ContainerStarted","Data":"45edb5b94378bb612dcd6c05d4818752ac66b6e3b611817a720388022ca5e1ae"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.517806 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12edb685-7b5d-43c2-8067-b1855e9a8e7b","Type":"ContainerStarted","Data":"4c2dfb023f0df5abbb04665d0ef3d68de5ccbb47f3aff277762b8e3d300c8965"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.519408 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52","Type":"ContainerStarted","Data":"2fe2cf6c4ced7ae91309f4600867caa542e6430892eb040e12462812e82156d9"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.521479 5118 generic.go:334] "Generic (PLEG): container finished" podID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerID="b888e9b874ca4b9ed8a78c6f1b2149b163a6ab83367e510c8688d0c12954e4a3" exitCode=0 Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.521527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" event={"ID":"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95","Type":"ContainerDied","Data":"b888e9b874ca4b9ed8a78c6f1b2149b163a6ab83367e510c8688d0c12954e4a3"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.521544 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" event={"ID":"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95","Type":"ContainerStarted","Data":"fda8834ff4398927fc7e6dc2bde87a24e7f1fb26d9ff7265fc010a68e3e5c934"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.525228 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pn8cv" 
event={"ID":"301b65e4-1307-4a85-a3fc-54d4b49508d6","Type":"ContainerStarted","Data":"7fb0d63f9689552d1e43a3c320a8c7d479c783f5dccff98b23d71e33f1ed0c18"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.525262 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pn8cv" event={"ID":"301b65e4-1307-4a85-a3fc-54d4b49508d6","Type":"ContainerStarted","Data":"6947304677b44d30dc7fbd5a620be4cca21d69042209ed6875b7d2af5ff8a7dc"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.527157 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerStarted","Data":"dd67a4c2c753606af0d0a1f19ee14fb2af2d94066814940c3a6ea860904bfb5b"} Feb 23 08:47:23 crc kubenswrapper[5118]: I0223 08:47:23.583063 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pn8cv" podStartSLOduration=2.583040916 podStartE2EDuration="2.583040916s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:23.567492041 +0000 UTC m=+7306.571276614" watchObservedRunningTime="2026-02-23 08:47:23.583040916 +0000 UTC m=+7306.586825489" Feb 23 08:47:24 crc kubenswrapper[5118]: I0223 08:47:24.542225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-twmzg" event={"ID":"10c16330-f3b7-4cad-ad50-34a5e5f79b9b","Type":"ContainerStarted","Data":"cee75633c13bcd9438093bc06917d7cccea8cb048389510b162a73da2ad95fe3"} Feb 23 08:47:24 crc kubenswrapper[5118]: I0223 08:47:24.593271 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-twmzg" podStartSLOduration=2.593246915 podStartE2EDuration="2.593246915s" podCreationTimestamp="2026-02-23 08:47:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:24.563600201 +0000 UTC m=+7307.567384774" watchObservedRunningTime="2026-02-23 08:47:24.593246915 +0000 UTC m=+7307.597031488" Feb 23 08:47:25 crc kubenswrapper[5118]: I0223 08:47:25.556351 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" event={"ID":"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95","Type":"ContainerStarted","Data":"7023fae62968ec8b48f3a38acafb5befc3831f86e7b6441021e0e28d90aee46a"} Feb 23 08:47:25 crc kubenswrapper[5118]: I0223 08:47:25.582298 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" podStartSLOduration=4.582270745 podStartE2EDuration="4.582270745s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:25.580916803 +0000 UTC m=+7308.584701376" watchObservedRunningTime="2026-02-23 08:47:25.582270745 +0000 UTC m=+7308.586055318" Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.565719 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerStarted","Data":"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.566115 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerStarted","Data":"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.569629 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerStarted","Data":"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.569695 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerStarted","Data":"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.572648 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12edb685-7b5d-43c2-8067-b1855e9a8e7b","Type":"ContainerStarted","Data":"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.585008 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52","Type":"ContainerStarted","Data":"78d34c23cc6382d352e86b30a107dc2a84567ddc2ec844444c8d11077d4669bc"} Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.585433 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.604062 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.98722472 podStartE2EDuration="5.604045394s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="2026-02-23 08:47:23.125047245 +0000 UTC m=+7306.128831808" lastFinishedPulling="2026-02-23 08:47:25.741867909 +0000 UTC m=+7308.745652482" observedRunningTime="2026-02-23 08:47:26.596284857 +0000 UTC m=+7309.600069420" watchObservedRunningTime="2026-02-23 08:47:26.604045394 +0000 UTC m=+7309.607829967" Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.674748 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=2.869926555 podStartE2EDuration="5.674727196s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="2026-02-23 08:47:22.936861363 +0000 UTC m=+7305.940645936" lastFinishedPulling="2026-02-23 08:47:25.741662004 +0000 UTC m=+7308.745446577" observedRunningTime="2026-02-23 08:47:26.670586226 +0000 UTC m=+7309.674370809" watchObservedRunningTime="2026-02-23 08:47:26.674727196 +0000 UTC m=+7309.678511769" Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.681601 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.604721258 podStartE2EDuration="5.681586461s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="2026-02-23 08:47:22.675635372 +0000 UTC m=+7305.679419945" lastFinishedPulling="2026-02-23 08:47:25.752500575 +0000 UTC m=+7308.756285148" observedRunningTime="2026-02-23 08:47:26.634108188 +0000 UTC m=+7309.637892761" watchObservedRunningTime="2026-02-23 08:47:26.681586461 +0000 UTC m=+7309.685371034" Feb 23 08:47:26 crc kubenswrapper[5118]: I0223 08:47:26.691192 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.185756592 podStartE2EDuration="5.691164502s" podCreationTimestamp="2026-02-23 08:47:21 +0000 UTC" firstStartedPulling="2026-02-23 08:47:23.235827253 +0000 UTC m=+7306.239611826" lastFinishedPulling="2026-02-23 08:47:25.741235163 +0000 UTC m=+7308.745019736" observedRunningTime="2026-02-23 08:47:26.6885735 +0000 UTC m=+7309.692358073" watchObservedRunningTime="2026-02-23 08:47:26.691164502 +0000 UTC m=+7309.694949075" Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.032245 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.032303 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.214195 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.374895 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.625473 5118 generic.go:334] "Generic (PLEG): container finished" podID="10c16330-f3b7-4cad-ad50-34a5e5f79b9b" containerID="cee75633c13bcd9438093bc06917d7cccea8cb048389510b162a73da2ad95fe3" exitCode=0 Feb 23 08:47:27 crc kubenswrapper[5118]: I0223 08:47:27.625572 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-twmzg" event={"ID":"10c16330-f3b7-4cad-ad50-34a5e5f79b9b","Type":"ContainerDied","Data":"cee75633c13bcd9438093bc06917d7cccea8cb048389510b162a73da2ad95fe3"} Feb 23 08:47:28 crc kubenswrapper[5118]: I0223 08:47:28.637597 5118 generic.go:334] "Generic (PLEG): container finished" podID="301b65e4-1307-4a85-a3fc-54d4b49508d6" containerID="7fb0d63f9689552d1e43a3c320a8c7d479c783f5dccff98b23d71e33f1ed0c18" exitCode=0 Feb 23 08:47:28 crc kubenswrapper[5118]: I0223 08:47:28.637810 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pn8cv" event={"ID":"301b65e4-1307-4a85-a3fc-54d4b49508d6","Type":"ContainerDied","Data":"7fb0d63f9689552d1e43a3c320a8c7d479c783f5dccff98b23d71e33f1ed0c18"} Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.128890 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.252113 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data\") pod \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.252534 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nggz\" (UniqueName: \"kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz\") pod \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.252572 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts\") pod \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.252624 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle\") pod \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\" (UID: \"10c16330-f3b7-4cad-ad50-34a5e5f79b9b\") " Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.259338 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz" (OuterVolumeSpecName: "kube-api-access-8nggz") pod "10c16330-f3b7-4cad-ad50-34a5e5f79b9b" (UID: "10c16330-f3b7-4cad-ad50-34a5e5f79b9b"). InnerVolumeSpecName "kube-api-access-8nggz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.261229 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts" (OuterVolumeSpecName: "scripts") pod "10c16330-f3b7-4cad-ad50-34a5e5f79b9b" (UID: "10c16330-f3b7-4cad-ad50-34a5e5f79b9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.292438 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data" (OuterVolumeSpecName: "config-data") pod "10c16330-f3b7-4cad-ad50-34a5e5f79b9b" (UID: "10c16330-f3b7-4cad-ad50-34a5e5f79b9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.306661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c16330-f3b7-4cad-ad50-34a5e5f79b9b" (UID: "10c16330-f3b7-4cad-ad50-34a5e5f79b9b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.355157 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.355205 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.355219 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nggz\" (UniqueName: \"kubernetes.io/projected/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-kube-api-access-8nggz\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.355236 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10c16330-f3b7-4cad-ad50-34a5e5f79b9b-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.649034 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-twmzg" event={"ID":"10c16330-f3b7-4cad-ad50-34a5e5f79b9b","Type":"ContainerDied","Data":"45edb5b94378bb612dcd6c05d4818752ac66b6e3b611817a720388022ca5e1ae"} Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.651157 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45edb5b94378bb612dcd6c05d4818752ac66b6e3b611817a720388022ca5e1ae" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.651262 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-twmzg" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.759524 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:47:29 crc kubenswrapper[5118]: E0223 08:47:29.759950 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c16330-f3b7-4cad-ad50-34a5e5f79b9b" containerName="nova-cell1-conductor-db-sync" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.759967 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c16330-f3b7-4cad-ad50-34a5e5f79b9b" containerName="nova-cell1-conductor-db-sync" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.760146 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c16330-f3b7-4cad-ad50-34a5e5f79b9b" containerName="nova-cell1-conductor-db-sync" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.760777 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.764362 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.788938 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.866562 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqrg\" (UniqueName: \"kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.866602 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.866634 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.967972 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.968865 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqrg\" (UniqueName: \"kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.968898 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.973415 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle\") pod 
\"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.973513 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.986212 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqrg\" (UniqueName: \"kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg\") pod \"nova-cell1-conductor-0\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:29 crc kubenswrapper[5118]: I0223 08:47:29.991122 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pn8cv" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.070625 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts\") pod \"301b65e4-1307-4a85-a3fc-54d4b49508d6\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.071319 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data\") pod \"301b65e4-1307-4a85-a3fc-54d4b49508d6\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.071465 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hr9f\" (UniqueName: 
\"kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f\") pod \"301b65e4-1307-4a85-a3fc-54d4b49508d6\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.071642 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") pod \"301b65e4-1307-4a85-a3fc-54d4b49508d6\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.074211 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts" (OuterVolumeSpecName: "scripts") pod "301b65e4-1307-4a85-a3fc-54d4b49508d6" (UID: "301b65e4-1307-4a85-a3fc-54d4b49508d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.074669 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f" (OuterVolumeSpecName: "kube-api-access-7hr9f") pod "301b65e4-1307-4a85-a3fc-54d4b49508d6" (UID: "301b65e4-1307-4a85-a3fc-54d4b49508d6"). InnerVolumeSpecName "kube-api-access-7hr9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.081142 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:30 crc kubenswrapper[5118]: E0223 08:47:30.099449 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle podName:301b65e4-1307-4a85-a3fc-54d4b49508d6 nodeName:}" failed. No retries permitted until 2026-02-23 08:47:30.599415026 +0000 UTC m=+7313.603199599 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle") pod "301b65e4-1307-4a85-a3fc-54d4b49508d6" (UID: "301b65e4-1307-4a85-a3fc-54d4b49508d6") : error deleting /var/lib/kubelet/pods/301b65e4-1307-4a85-a3fc-54d4b49508d6/volume-subpaths: remove /var/lib/kubelet/pods/301b65e4-1307-4a85-a3fc-54d4b49508d6/volume-subpaths: no such file or directory Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.103452 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data" (OuterVolumeSpecName: "config-data") pod "301b65e4-1307-4a85-a3fc-54d4b49508d6" (UID: "301b65e4-1307-4a85-a3fc-54d4b49508d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.174455 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.174503 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.174523 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hr9f\" (UniqueName: \"kubernetes.io/projected/301b65e4-1307-4a85-a3fc-54d4b49508d6-kube-api-access-7hr9f\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.528803 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:47:30 crc kubenswrapper[5118]: W0223 08:47:30.530645 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod285709b4_ebda_4e1f_91c3_905c79ab31af.slice/crio-942ea0f7651acc5fa8de96ba6853c4e1244fae50d98d4d9d2f52e29b0a6fa599 WatchSource:0}: Error finding container 942ea0f7651acc5fa8de96ba6853c4e1244fae50d98d4d9d2f52e29b0a6fa599: Status 404 returned error can't find the container with id 942ea0f7651acc5fa8de96ba6853c4e1244fae50d98d4d9d2f52e29b0a6fa599 Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.674391 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pn8cv" event={"ID":"301b65e4-1307-4a85-a3fc-54d4b49508d6","Type":"ContainerDied","Data":"6947304677b44d30dc7fbd5a620be4cca21d69042209ed6875b7d2af5ff8a7dc"} Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.674799 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6947304677b44d30dc7fbd5a620be4cca21d69042209ed6875b7d2af5ff8a7dc" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.674441 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pn8cv" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.676357 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285709b4-ebda-4e1f-91c3-905c79ab31af","Type":"ContainerStarted","Data":"942ea0f7651acc5fa8de96ba6853c4e1244fae50d98d4d9d2f52e29b0a6fa599"} Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.688597 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") pod \"301b65e4-1307-4a85-a3fc-54d4b49508d6\" (UID: \"301b65e4-1307-4a85-a3fc-54d4b49508d6\") " Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.694982 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "301b65e4-1307-4a85-a3fc-54d4b49508d6" (UID: "301b65e4-1307-4a85-a3fc-54d4b49508d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.790579 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/301b65e4-1307-4a85-a3fc-54d4b49508d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.960587 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.960817 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-log" containerID="cri-o://94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" gracePeriod=30 Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.961272 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-api" containerID="cri-o://ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" gracePeriod=30 Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.975489 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:30 crc kubenswrapper[5118]: I0223 08:47:30.975733 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" containerName="nova-scheduler-scheduler" containerID="cri-o://97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af" gracePeriod=30 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.010974 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.011371 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-log" containerID="cri-o://2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" gracePeriod=30 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.011571 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-metadata" containerID="cri-o://257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" gracePeriod=30 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.535270 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.601429 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.616961 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2vzg\" (UniqueName: \"kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg\") pod \"dc95d343-d5d6-42f8-86a1-ab57138763cd\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.617005 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88xc4\" (UniqueName: \"kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4\") pod \"7474fb59-ea30-419a-bb8c-8151f612a831\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.617081 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle\") pod \"dc95d343-d5d6-42f8-86a1-ab57138763cd\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " Feb 23 08:47:31 
crc kubenswrapper[5118]: I0223 08:47:31.617124 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle\") pod \"7474fb59-ea30-419a-bb8c-8151f612a831\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.618989 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data\") pod \"dc95d343-d5d6-42f8-86a1-ab57138763cd\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.619137 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data\") pod \"7474fb59-ea30-419a-bb8c-8151f612a831\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.619177 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs\") pod \"7474fb59-ea30-419a-bb8c-8151f612a831\" (UID: \"7474fb59-ea30-419a-bb8c-8151f612a831\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.619282 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs\") pod \"dc95d343-d5d6-42f8-86a1-ab57138763cd\" (UID: \"dc95d343-d5d6-42f8-86a1-ab57138763cd\") " Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.619885 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs" (OuterVolumeSpecName: "logs") pod "dc95d343-d5d6-42f8-86a1-ab57138763cd" (UID: 
"dc95d343-d5d6-42f8-86a1-ab57138763cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.620599 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc95d343-d5d6-42f8-86a1-ab57138763cd-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.629300 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs" (OuterVolumeSpecName: "logs") pod "7474fb59-ea30-419a-bb8c-8151f612a831" (UID: "7474fb59-ea30-419a-bb8c-8151f612a831"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.634234 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4" (OuterVolumeSpecName: "kube-api-access-88xc4") pod "7474fb59-ea30-419a-bb8c-8151f612a831" (UID: "7474fb59-ea30-419a-bb8c-8151f612a831"). InnerVolumeSpecName "kube-api-access-88xc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.637961 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg" (OuterVolumeSpecName: "kube-api-access-r2vzg") pod "dc95d343-d5d6-42f8-86a1-ab57138763cd" (UID: "dc95d343-d5d6-42f8-86a1-ab57138763cd"). InnerVolumeSpecName "kube-api-access-r2vzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.656569 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data" (OuterVolumeSpecName: "config-data") pod "dc95d343-d5d6-42f8-86a1-ab57138763cd" (UID: "dc95d343-d5d6-42f8-86a1-ab57138763cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.659768 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc95d343-d5d6-42f8-86a1-ab57138763cd" (UID: "dc95d343-d5d6-42f8-86a1-ab57138763cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.669741 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data" (OuterVolumeSpecName: "config-data") pod "7474fb59-ea30-419a-bb8c-8151f612a831" (UID: "7474fb59-ea30-419a-bb8c-8151f612a831"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.672427 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7474fb59-ea30-419a-bb8c-8151f612a831" (UID: "7474fb59-ea30-419a-bb8c-8151f612a831"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.688638 5118 generic.go:334] "Generic (PLEG): container finished" podID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerID="ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" exitCode=0 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689274 5118 generic.go:334] "Generic (PLEG): container finished" podID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerID="94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" exitCode=143 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689373 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerDied","Data":"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689450 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerDied","Data":"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689504 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc95d343-d5d6-42f8-86a1-ab57138763cd","Type":"ContainerDied","Data":"dd67a4c2c753606af0d0a1f19ee14fb2af2d94066814940c3a6ea860904bfb5b"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689569 5118 scope.go:117] "RemoveContainer" containerID="ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.689745 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.710556 5118 generic.go:334] "Generic (PLEG): container finished" podID="7474fb59-ea30-419a-bb8c-8151f612a831" containerID="257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" exitCode=0 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.710622 5118 generic.go:334] "Generic (PLEG): container finished" podID="7474fb59-ea30-419a-bb8c-8151f612a831" containerID="2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" exitCode=143 Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.710641 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.718069 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerDied","Data":"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.718151 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerDied","Data":"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.718163 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7474fb59-ea30-419a-bb8c-8151f612a831","Type":"ContainerDied","Data":"783282d1e64aaa81cbb55424a26ba173bd1b6829d49fa9d9af79dde5d5bd6a24"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723400 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2vzg\" (UniqueName: \"kubernetes.io/projected/dc95d343-d5d6-42f8-86a1-ab57138763cd-kube-api-access-r2vzg\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723436 5118 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88xc4\" (UniqueName: \"kubernetes.io/projected/7474fb59-ea30-419a-bb8c-8151f612a831-kube-api-access-88xc4\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723446 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723457 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723468 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc95d343-d5d6-42f8-86a1-ab57138763cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723477 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7474fb59-ea30-419a-bb8c-8151f612a831-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.723491 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7474fb59-ea30-419a-bb8c-8151f612a831-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.729614 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285709b4-ebda-4e1f-91c3-905c79ab31af","Type":"ContainerStarted","Data":"25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1"} Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.729778 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.738037 5118 scope.go:117] "RemoveContainer" containerID="94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.745061 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.768040 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.773396 5118 scope.go:117] "RemoveContainer" containerID="ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.776082 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c\": container with ID starting with ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c not found: ID does not exist" containerID="ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.776146 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c"} err="failed to get container status \"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c\": rpc error: code = NotFound desc = could not find container \"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c\": container with ID starting with ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.776180 5118 scope.go:117] "RemoveContainer" containerID="94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.780284 5118 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215\": container with ID starting with 94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215 not found: ID does not exist" containerID="94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.780341 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215"} err="failed to get container status \"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215\": rpc error: code = NotFound desc = could not find container \"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215\": container with ID starting with 94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215 not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.780379 5118 scope.go:117] "RemoveContainer" containerID="ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.784247 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c"} err="failed to get container status \"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c\": rpc error: code = NotFound desc = could not find container \"ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c\": container with ID starting with ddd9654748f0f2fd506c89dafe46b553f6e93c5ef8d0fcadd501e622d1edd32c not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.784298 5118 scope.go:117] "RemoveContainer" containerID="94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 
08:47:31.788262 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215"} err="failed to get container status \"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215\": rpc error: code = NotFound desc = could not find container \"94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215\": container with ID starting with 94dd4f9b350f70c4a934ad3f8fe93bdf8d7c5ce70b6491d559cd83105535e215 not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.788312 5118 scope.go:117] "RemoveContainer" containerID="257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804259 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.804693 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-log" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804707 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-log" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.804735 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301b65e4-1307-4a85-a3fc-54d4b49508d6" containerName="nova-manage" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804741 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="301b65e4-1307-4a85-a3fc-54d4b49508d6" containerName="nova-manage" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.804758 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-api" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804764 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-api" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.804773 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-log" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804779 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-log" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.804788 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-metadata" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804794 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-metadata" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804953 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-log" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804971 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="301b65e4-1307-4a85-a3fc-54d4b49508d6" containerName="nova-manage" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804981 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-log" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804989 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" containerName="nova-metadata-metadata" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.804999 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" containerName="nova-api-api" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.806065 5118 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.819627 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.819601256 podStartE2EDuration="2.819601256s" podCreationTimestamp="2026-02-23 08:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:31.757271604 +0000 UTC m=+7314.761056177" watchObservedRunningTime="2026-02-23 08:47:31.819601256 +0000 UTC m=+7314.823385849" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.821488 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.824559 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.824609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.824701 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8222v\" (UniqueName: \"kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.824721 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.832942 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.839592 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.847207 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.851438 5118 scope.go:117] "RemoveContainer" containerID="2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.856269 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.858758 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.861247 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.861581 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.890998 5118 scope.go:117] "RemoveContainer" containerID="257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.891900 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf\": container with ID starting with 257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf not found: ID does not exist" containerID="257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.891959 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf"} err="failed to get container status \"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf\": rpc error: code = NotFound desc = could not find container \"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf\": container with ID starting with 257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.892002 5118 scope.go:117] "RemoveContainer" containerID="2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" Feb 23 08:47:31 crc kubenswrapper[5118]: E0223 08:47:31.892376 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e\": container with ID starting with 2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e not found: ID does not exist" containerID="2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.892421 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e"} err="failed to get container status \"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e\": rpc error: code = NotFound desc = could not find container \"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e\": container with ID starting with 2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.892466 5118 scope.go:117] "RemoveContainer" containerID="257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.893133 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf"} err="failed to get container status \"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf\": rpc error: code = NotFound desc = could not find container \"257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf\": container with ID starting with 257b5b855a266d19b697b6ae1a566ecfd09175c5231b263aca61f0e60b911dbf not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.893161 5118 scope.go:117] "RemoveContainer" containerID="2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.893507 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e"} err="failed to get container status \"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e\": rpc error: code = NotFound desc = could not find container \"2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e\": container with ID starting with 2c598b021daf9a53c62bca64c28ffed2ed6342cd3b980d5718e064134e84eb2e not found: ID does not exist" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925419 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925444 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925474 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925505 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925561 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8222v\" (UniqueName: \"kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925585 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.925629 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46bvd\" (UniqueName: \"kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.926225 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.932850 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.934178 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:31 crc kubenswrapper[5118]: I0223 08:47:31.945201 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8222v\" (UniqueName: \"kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v\") pod \"nova-api-0\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") " pod="openstack/nova-api-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.031255 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.031370 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46bvd\" (UniqueName: \"kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.031430 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.031462 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.031840 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.036693 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.037607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.049709 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46bvd\" (UniqueName: \"kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd\") pod \"nova-metadata-0\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.159518 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.195079 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.303411 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.362555 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.374564 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.389142 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"] Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.389410 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="dnsmasq-dns" containerID="cri-o://bd5e08520647fe05a940a13bb8b1c905fbc7c5ae94830059983c760825cc1b5b" gracePeriod=10 Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.397113 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.442636 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data\") pod \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.442944 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle\") pod \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " 
Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.443063 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff7jt\" (UniqueName: \"kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt\") pod \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\" (UID: \"12edb685-7b5d-43c2-8067-b1855e9a8e7b\") " Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.449872 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt" (OuterVolumeSpecName: "kube-api-access-ff7jt") pod "12edb685-7b5d-43c2-8067-b1855e9a8e7b" (UID: "12edb685-7b5d-43c2-8067-b1855e9a8e7b"). InnerVolumeSpecName "kube-api-access-ff7jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.474047 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data" (OuterVolumeSpecName: "config-data") pod "12edb685-7b5d-43c2-8067-b1855e9a8e7b" (UID: "12edb685-7b5d-43c2-8067-b1855e9a8e7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.504280 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12edb685-7b5d-43c2-8067-b1855e9a8e7b" (UID: "12edb685-7b5d-43c2-8067-b1855e9a8e7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.545666 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.545714 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff7jt\" (UniqueName: \"kubernetes.io/projected/12edb685-7b5d-43c2-8067-b1855e9a8e7b-kube-api-access-ff7jt\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.545728 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12edb685-7b5d-43c2-8067-b1855e9a8e7b-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.682328 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.755974 5118 generic.go:334] "Generic (PLEG): container finished" podID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" containerID="97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af" exitCode=0 Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.756053 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12edb685-7b5d-43c2-8067-b1855e9a8e7b","Type":"ContainerDied","Data":"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af"} Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.756089 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12edb685-7b5d-43c2-8067-b1855e9a8e7b","Type":"ContainerDied","Data":"4c2dfb023f0df5abbb04665d0ef3d68de5ccbb47f3aff277762b8e3d300c8965"} Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.756125 5118 scope.go:117] "RemoveContainer" 
containerID="97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.756250 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.763563 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerStarted","Data":"aaaaf71be3b2c33e141415382c426778865cdddb8508154b69127c23cd443966"} Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.765670 5118 generic.go:334] "Generic (PLEG): container finished" podID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerID="bd5e08520647fe05a940a13bb8b1c905fbc7c5ae94830059983c760825cc1b5b" exitCode=0 Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.766993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" event={"ID":"444d1530-7a28-4644-8308-1ba2c57dfe2c","Type":"ContainerDied","Data":"bd5e08520647fe05a940a13bb8b1c905fbc7c5ae94830059983c760825cc1b5b"} Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.790201 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.792271 5118 scope.go:117] "RemoveContainer" containerID="97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af" Feb 23 08:47:32 crc kubenswrapper[5118]: E0223 08:47:32.792744 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af\": container with ID starting with 97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af not found: ID does not exist" containerID="97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.792768 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af"} err="failed to get container status \"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af\": rpc error: code = NotFound desc = could not find container \"97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af\": container with ID starting with 97ac667e7dcc53f2e4b0c3eed928bbbd9ec3c5d0be10ca0b84377c9997fb15af not found: ID does not exist" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.797306 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.936992 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.981179 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.982375 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.995745 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:32 crc kubenswrapper[5118]: E0223 08:47:32.996326 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="init" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.996345 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="init" Feb 23 08:47:32 crc kubenswrapper[5118]: E0223 08:47:32.996355 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="dnsmasq-dns" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.996362 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="dnsmasq-dns" Feb 23 08:47:32 crc kubenswrapper[5118]: E0223 08:47:32.996381 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" containerName="nova-scheduler-scheduler" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.996388 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" containerName="nova-scheduler-scheduler" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.996558 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" containerName="dnsmasq-dns" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.996588 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" containerName="nova-scheduler-scheduler" Feb 23 08:47:32 crc kubenswrapper[5118]: I0223 08:47:32.997325 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.000900 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.004683 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.167939 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config\") pod \"444d1530-7a28-4644-8308-1ba2c57dfe2c\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168010 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc\") pod \"444d1530-7a28-4644-8308-1ba2c57dfe2c\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168142 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb\") pod \"444d1530-7a28-4644-8308-1ba2c57dfe2c\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168209 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4dd2\" (UniqueName: \"kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2\") pod \"444d1530-7a28-4644-8308-1ba2c57dfe2c\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168232 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb\") pod \"444d1530-7a28-4644-8308-1ba2c57dfe2c\" (UID: \"444d1530-7a28-4644-8308-1ba2c57dfe2c\") " Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168456 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168508 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm65p\" (UniqueName: \"kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.168574 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.180456 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2" (OuterVolumeSpecName: "kube-api-access-x4dd2") pod "444d1530-7a28-4644-8308-1ba2c57dfe2c" (UID: "444d1530-7a28-4644-8308-1ba2c57dfe2c"). InnerVolumeSpecName "kube-api-access-x4dd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.212459 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config" (OuterVolumeSpecName: "config") pod "444d1530-7a28-4644-8308-1ba2c57dfe2c" (UID: "444d1530-7a28-4644-8308-1ba2c57dfe2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.223588 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "444d1530-7a28-4644-8308-1ba2c57dfe2c" (UID: "444d1530-7a28-4644-8308-1ba2c57dfe2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.225663 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "444d1530-7a28-4644-8308-1ba2c57dfe2c" (UID: "444d1530-7a28-4644-8308-1ba2c57dfe2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.237648 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "444d1530-7a28-4644-8308-1ba2c57dfe2c" (UID: "444d1530-7a28-4644-8308-1ba2c57dfe2c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.269863 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.269959 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.270007 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm65p\" (UniqueName: \"kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.270613 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.271131 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.271670 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.271711 
5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4dd2\" (UniqueName: \"kubernetes.io/projected/444d1530-7a28-4644-8308-1ba2c57dfe2c-kube-api-access-x4dd2\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.271738 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/444d1530-7a28-4644-8308-1ba2c57dfe2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.273913 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.275797 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.301106 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm65p\" (UniqueName: \"kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p\") pod \"nova-scheduler-0\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.559064 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.708863 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12edb685-7b5d-43c2-8067-b1855e9a8e7b" path="/var/lib/kubelet/pods/12edb685-7b5d-43c2-8067-b1855e9a8e7b/volumes" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.709740 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7474fb59-ea30-419a-bb8c-8151f612a831" path="/var/lib/kubelet/pods/7474fb59-ea30-419a-bb8c-8151f612a831/volumes" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.710480 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc95d343-d5d6-42f8-86a1-ab57138763cd" path="/var/lib/kubelet/pods/dc95d343-d5d6-42f8-86a1-ab57138763cd/volumes" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.786192 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" event={"ID":"444d1530-7a28-4644-8308-1ba2c57dfe2c","Type":"ContainerDied","Data":"e9c9fa74729a08385a9caa8e037b992e12e7ba3676f53217a42508723bff18db"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.786246 5118 scope.go:117] "RemoveContainer" containerID="bd5e08520647fe05a940a13bb8b1c905fbc7c5ae94830059983c760825cc1b5b" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.786266 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f6d7f58c-xw89l" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.793423 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerStarted","Data":"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.793985 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerStarted","Data":"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.807725 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerStarted","Data":"864c2c0c4dab5720be4a7906dc34e4f90eb2d615cb472b23d30062c300e842f7"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.807773 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerStarted","Data":"39966dbde33ad8fc3bb13d95441b1e330b6e63d7444b003ae8f78c2df864f36e"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.807785 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerStarted","Data":"ff690feef586cfad0256e09565f5b4837e83630481d4bdbd63d10b10c51c4082"} Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.820409 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"] Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.837531 5118 scope.go:117] "RemoveContainer" containerID="6352248c154b9174e43a38f36c0fe09b42a8a216d60075223ec542e52b48cf26" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.847971 5118 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85f6d7f58c-xw89l"] Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.852582 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.852561628 podStartE2EDuration="2.852561628s" podCreationTimestamp="2026-02-23 08:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:33.837937675 +0000 UTC m=+7316.841722248" watchObservedRunningTime="2026-02-23 08:47:33.852561628 +0000 UTC m=+7316.856346201" Feb 23 08:47:33 crc kubenswrapper[5118]: I0223 08:47:33.872090 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.872069318 podStartE2EDuration="2.872069318s" podCreationTimestamp="2026-02-23 08:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:33.869537327 +0000 UTC m=+7316.873321900" watchObservedRunningTime="2026-02-23 08:47:33.872069318 +0000 UTC m=+7316.875853891" Feb 23 08:47:34 crc kubenswrapper[5118]: I0223 08:47:34.115109 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:34 crc kubenswrapper[5118]: I0223 08:47:34.823197 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3a40be-b4c9-4e7e-97e3-fd23ae649889","Type":"ContainerStarted","Data":"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"} Feb 23 08:47:34 crc kubenswrapper[5118]: I0223 08:47:34.823655 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3a40be-b4c9-4e7e-97e3-fd23ae649889","Type":"ContainerStarted","Data":"ed15fb4346c938e79b1b02d487119954bc29e8fc9a93d27f5992a48810d44689"} Feb 23 08:47:34 crc kubenswrapper[5118]: I0223 
08:47:34.870615 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.870585926 podStartE2EDuration="2.870585926s" podCreationTimestamp="2026-02-23 08:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:34.849079817 +0000 UTC m=+7317.852864430" watchObservedRunningTime="2026-02-23 08:47:34.870585926 +0000 UTC m=+7317.874370539" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.133597 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.690458 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wxcjh"] Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.691627 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.694462 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.694814 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.713458 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444d1530-7a28-4644-8308-1ba2c57dfe2c" path="/var/lib/kubelet/pods/444d1530-7a28-4644-8308-1ba2c57dfe2c/volumes" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.771145 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxcjh"] Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.833337 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.833741 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzq4b\" (UniqueName: \"kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.833824 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.833878 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.936065 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.936676 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzq4b\" 
(UniqueName: \"kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.936709 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.936734 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.943367 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.944930 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.959164 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:35 crc kubenswrapper[5118]: I0223 08:47:35.959970 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzq4b\" (UniqueName: \"kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b\") pod \"nova-cell1-cell-mapping-wxcjh\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:36 crc kubenswrapper[5118]: I0223 08:47:36.047063 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:36 crc kubenswrapper[5118]: I0223 08:47:36.594803 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxcjh"] Feb 23 08:47:36 crc kubenswrapper[5118]: W0223 08:47:36.597299 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6582370_a366_423e_88a4_da5b540bb5bf.slice/crio-34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2 WatchSource:0}: Error finding container 34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2: Status 404 returned error can't find the container with id 34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2 Feb 23 08:47:36 crc kubenswrapper[5118]: I0223 08:47:36.850073 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxcjh" event={"ID":"d6582370-a366-423e-88a4-da5b540bb5bf","Type":"ContainerStarted","Data":"e5770682a4a710cf5e4872bfff0f41b98c32bed8380f55063354b8a5b03f9cb4"} Feb 23 08:47:36 crc kubenswrapper[5118]: I0223 08:47:36.850589 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxcjh" 
event={"ID":"d6582370-a366-423e-88a4-da5b540bb5bf","Type":"ContainerStarted","Data":"34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2"} Feb 23 08:47:36 crc kubenswrapper[5118]: I0223 08:47:36.878038 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wxcjh" podStartSLOduration=1.878005852 podStartE2EDuration="1.878005852s" podCreationTimestamp="2026-02-23 08:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:36.871224999 +0000 UTC m=+7319.875009572" watchObservedRunningTime="2026-02-23 08:47:36.878005852 +0000 UTC m=+7319.881790435" Feb 23 08:47:37 crc kubenswrapper[5118]: I0223 08:47:37.195824 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:47:37 crc kubenswrapper[5118]: I0223 08:47:37.195974 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:47:38 crc kubenswrapper[5118]: I0223 08:47:38.559704 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:47:41 crc kubenswrapper[5118]: I0223 08:47:41.934203 5118 generic.go:334] "Generic (PLEG): container finished" podID="d6582370-a366-423e-88a4-da5b540bb5bf" containerID="e5770682a4a710cf5e4872bfff0f41b98c32bed8380f55063354b8a5b03f9cb4" exitCode=0 Feb 23 08:47:41 crc kubenswrapper[5118]: I0223 08:47:41.934297 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxcjh" event={"ID":"d6582370-a366-423e-88a4-da5b540bb5bf","Type":"ContainerDied","Data":"e5770682a4a710cf5e4872bfff0f41b98c32bed8380f55063354b8a5b03f9cb4"} Feb 23 08:47:42 crc kubenswrapper[5118]: I0223 08:47:42.161200 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:47:42 crc kubenswrapper[5118]: I0223 
08:47:42.161560 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:47:42 crc kubenswrapper[5118]: I0223 08:47:42.195811 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:47:42 crc kubenswrapper[5118]: I0223 08:47:42.195885 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.243359 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.325331 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.325624 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.325729 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.434687 5118 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.453147 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzq4b\" (UniqueName: \"kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b\") pod \"d6582370-a366-423e-88a4-da5b540bb5bf\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.454070 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts\") pod \"d6582370-a366-423e-88a4-da5b540bb5bf\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.455137 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data\") pod \"d6582370-a366-423e-88a4-da5b540bb5bf\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.455332 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle\") pod \"d6582370-a366-423e-88a4-da5b540bb5bf\" (UID: \"d6582370-a366-423e-88a4-da5b540bb5bf\") " Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.468354 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b" (OuterVolumeSpecName: "kube-api-access-xzq4b") pod "d6582370-a366-423e-88a4-da5b540bb5bf" (UID: "d6582370-a366-423e-88a4-da5b540bb5bf"). InnerVolumeSpecName "kube-api-access-xzq4b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.475442 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts" (OuterVolumeSpecName: "scripts") pod "d6582370-a366-423e-88a4-da5b540bb5bf" (UID: "d6582370-a366-423e-88a4-da5b540bb5bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.511497 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data" (OuterVolumeSpecName: "config-data") pod "d6582370-a366-423e-88a4-da5b540bb5bf" (UID: "d6582370-a366-423e-88a4-da5b540bb5bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.515668 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6582370-a366-423e-88a4-da5b540bb5bf" (UID: "d6582370-a366-423e-88a4-da5b540bb5bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.558520 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.558606 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.558619 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6582370-a366-423e-88a4-da5b540bb5bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.558632 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzq4b\" (UniqueName: \"kubernetes.io/projected/d6582370-a366-423e-88a4-da5b540bb5bf-kube-api-access-xzq4b\") on node \"crc\" DevicePath \"\"" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.559631 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.595258 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.977825 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wxcjh" Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.979695 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wxcjh" event={"ID":"d6582370-a366-423e-88a4-da5b540bb5bf","Type":"ContainerDied","Data":"34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2"} Feb 23 08:47:43 crc kubenswrapper[5118]: I0223 08:47:43.979892 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34dd798564728c136790a2f78715eac713aa76708c92a95324f380948ef00dc2" Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.019169 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.088362 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.088655 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-log" containerID="cri-o://6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3" gracePeriod=30 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.088934 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-api" containerID="cri-o://3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a" gracePeriod=30 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.161942 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.162376 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" 
containerName="nova-metadata-log" containerID="cri-o://39966dbde33ad8fc3bb13d95441b1e330b6e63d7444b003ae8f78c2df864f36e" gracePeriod=30 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.162502 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-metadata" containerID="cri-o://864c2c0c4dab5720be4a7906dc34e4f90eb2d615cb472b23d30062c300e842f7" gracePeriod=30 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.568540 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.989591 5118 generic.go:334] "Generic (PLEG): container finished" podID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerID="6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3" exitCode=143 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.989681 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerDied","Data":"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"} Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.992035 5118 generic.go:334] "Generic (PLEG): container finished" podID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerID="39966dbde33ad8fc3bb13d95441b1e330b6e63d7444b003ae8f78c2df864f36e" exitCode=143 Feb 23 08:47:44 crc kubenswrapper[5118]: I0223 08:47:44.992063 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerDied","Data":"39966dbde33ad8fc3bb13d95441b1e330b6e63d7444b003ae8f78c2df864f36e"} Feb 23 08:47:46 crc kubenswrapper[5118]: I0223 08:47:46.004764 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" 
containerName="nova-scheduler-scheduler" containerID="cri-o://884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74" gracePeriod=30 Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.049028 5118 generic.go:334] "Generic (PLEG): container finished" podID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerID="864c2c0c4dab5720be4a7906dc34e4f90eb2d615cb472b23d30062c300e842f7" exitCode=0 Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.049272 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerDied","Data":"864c2c0c4dab5720be4a7906dc34e4f90eb2d615cb472b23d30062c300e842f7"} Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.129601 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.296161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle\") pod \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.296246 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs\") pod \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.296345 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46bvd\" (UniqueName: \"kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd\") pod \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.296451 
5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data\") pod \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\" (UID: \"3c2e835e-2520-43fb-bb58-7b62b5e6a719\") " Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.297387 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs" (OuterVolumeSpecName: "logs") pod "3c2e835e-2520-43fb-bb58-7b62b5e6a719" (UID: "3c2e835e-2520-43fb-bb58-7b62b5e6a719"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.303815 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd" (OuterVolumeSpecName: "kube-api-access-46bvd") pod "3c2e835e-2520-43fb-bb58-7b62b5e6a719" (UID: "3c2e835e-2520-43fb-bb58-7b62b5e6a719"). InnerVolumeSpecName "kube-api-access-46bvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.332737 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data" (OuterVolumeSpecName: "config-data") pod "3c2e835e-2520-43fb-bb58-7b62b5e6a719" (UID: "3c2e835e-2520-43fb-bb58-7b62b5e6a719"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.334257 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c2e835e-2520-43fb-bb58-7b62b5e6a719" (UID: "3c2e835e-2520-43fb-bb58-7b62b5e6a719"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.399476 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.399520 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c2e835e-2520-43fb-bb58-7b62b5e6a719-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.399531 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46bvd\" (UniqueName: \"kubernetes.io/projected/3c2e835e-2520-43fb-bb58-7b62b5e6a719-kube-api-access-46bvd\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:48 crc kubenswrapper[5118]: I0223 08:47:48.399545 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c2e835e-2520-43fb-bb58-7b62b5e6a719-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:48 crc kubenswrapper[5118]: E0223 08:47:48.562124 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:47:48 crc kubenswrapper[5118]: E0223 08:47:48.564300 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:47:48 crc kubenswrapper[5118]: E0223 08:47:48.565977 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:47:48 crc kubenswrapper[5118]: E0223 08:47:48.566078 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" containerName="nova-scheduler-scheduler"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.034869 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.073153 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c2e835e-2520-43fb-bb58-7b62b5e6a719","Type":"ContainerDied","Data":"ff690feef586cfad0256e09565f5b4837e83630481d4bdbd63d10b10c51c4082"}
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.073261 5118 scope.go:117] "RemoveContainer" containerID="864c2c0c4dab5720be4a7906dc34e4f90eb2d615cb472b23d30062c300e842f7"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.073619 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.077921 5118 generic.go:334] "Generic (PLEG): container finished" podID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerID="3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a" exitCode=0
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.077977 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerDied","Data":"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"}
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.078026 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5e30b00-f716-45d5-b718-c06f52c855bf","Type":"ContainerDied","Data":"aaaaf71be3b2c33e141415382c426778865cdddb8508154b69127c23cd443966"}
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.078089 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.113463 5118 scope.go:117] "RemoveContainer" containerID="39966dbde33ad8fc3bb13d95441b1e330b6e63d7444b003ae8f78c2df864f36e"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.134734 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.165795 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.175447 5118 scope.go:117] "RemoveContainer" containerID="3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.178491 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.179123 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.179142 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.179161 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-metadata"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.179173 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-metadata"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.179192 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.179202 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.179229 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6582370-a366-423e-88a4-da5b540bb5bf" containerName="nova-manage"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.179238 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6582370-a366-423e-88a4-da5b540bb5bf" containerName="nova-manage"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.180311 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-api"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180331 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-api"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180786 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-metadata"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180818 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" containerName="nova-metadata-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180844 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-log"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180858 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6582370-a366-423e-88a4-da5b540bb5bf" containerName="nova-manage"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.180869 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" containerName="nova-api-api"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.183198 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.187019 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.197284 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.223589 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8222v\" (UniqueName: \"kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v\") pod \"e5e30b00-f716-45d5-b718-c06f52c855bf\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") "
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.223811 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs\") pod \"e5e30b00-f716-45d5-b718-c06f52c855bf\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") "
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.223968 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data\") pod \"e5e30b00-f716-45d5-b718-c06f52c855bf\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") "
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.224067 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle\") pod \"e5e30b00-f716-45d5-b718-c06f52c855bf\" (UID: \"e5e30b00-f716-45d5-b718-c06f52c855bf\") "
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.224996 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs" (OuterVolumeSpecName: "logs") pod "e5e30b00-f716-45d5-b718-c06f52c855bf" (UID: "e5e30b00-f716-45d5-b718-c06f52c855bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.227384 5118 scope.go:117] "RemoveContainer" containerID="6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.231748 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v" (OuterVolumeSpecName: "kube-api-access-8222v") pod "e5e30b00-f716-45d5-b718-c06f52c855bf" (UID: "e5e30b00-f716-45d5-b718-c06f52c855bf"). InnerVolumeSpecName "kube-api-access-8222v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.251407 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data" (OuterVolumeSpecName: "config-data") pod "e5e30b00-f716-45d5-b718-c06f52c855bf" (UID: "e5e30b00-f716-45d5-b718-c06f52c855bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.255012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5e30b00-f716-45d5-b718-c06f52c855bf" (UID: "e5e30b00-f716-45d5-b718-c06f52c855bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.326248 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.326466 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.326684 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv9xc\" (UniqueName: \"kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.326941 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.327882 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.327916 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e30b00-f716-45d5-b718-c06f52c855bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.327943 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8222v\" (UniqueName: \"kubernetes.io/projected/e5e30b00-f716-45d5-b718-c06f52c855bf-kube-api-access-8222v\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.327958 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5e30b00-f716-45d5-b718-c06f52c855bf-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.344789 5118 scope.go:117] "RemoveContainer" containerID="3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.345778 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a\": container with ID starting with 3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a not found: ID does not exist" containerID="3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.345831 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a"} err="failed to get container status \"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a\": rpc error: code = NotFound desc = could not find container \"3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a\": container with ID starting with 3175478dec3f614c639b49ecba69d42c880e176271c2fc636c5fb92a9293515a not found: ID does not exist"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.345866 5118 scope.go:117] "RemoveContainer" containerID="6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"
Feb 23 08:47:49 crc kubenswrapper[5118]: E0223 08:47:49.346572 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3\": container with ID starting with 6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3 not found: ID does not exist" containerID="6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.346607 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3"} err="failed to get container status \"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3\": rpc error: code = NotFound desc = could not find container \"6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3\": container with ID starting with 6c42b6ee7db247d6b37ee2f3956c53136e83f552eacf8119f936b1720981a2d3 not found: ID does not exist"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.420010 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.428814 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.429575 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.429636 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv9xc\" (UniqueName: \"kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.429684 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.429726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.430146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.438644 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.451206 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.458971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv9xc\" (UniqueName: \"kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc\") pod \"nova-metadata-0\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.474916 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.485888 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.487224 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.498265 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.516311 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.639398 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.639448 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.639532 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.639570 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9zm5\" (UniqueName: \"kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.717614 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c2e835e-2520-43fb-bb58-7b62b5e6a719" path="/var/lib/kubelet/pods/3c2e835e-2520-43fb-bb58-7b62b5e6a719/volumes"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.718624 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e30b00-f716-45d5-b718-c06f52c855bf" path="/var/lib/kubelet/pods/e5e30b00-f716-45d5-b718-c06f52c855bf/volumes"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.741494 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.741568 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9zm5\" (UniqueName: \"kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.742266 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.742343 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.743226 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.747729 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.760238 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.762728 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9zm5\" (UniqueName: \"kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5\") pod \"nova-api-0\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " pod="openstack/nova-api-0"
Feb 23 08:47:49 crc kubenswrapper[5118]: I0223 08:47:49.824931 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:47:50 crc kubenswrapper[5118]: I0223 08:47:50.015810 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:47:50 crc kubenswrapper[5118]: I0223 08:47:50.096993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerStarted","Data":"b08d566111abf84b500aba9e95630b8d86befd8045b99138a5a3216976ef3186"}
Feb 23 08:47:50 crc kubenswrapper[5118]: W0223 08:47:50.304355 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbccbce9a_839b_4566_a047_9cd168e610b4.slice/crio-92a21b3c8e45dafeb9c205a33e40b66bf84e268323034855efceb99312a09217 WatchSource:0}: Error finding container 92a21b3c8e45dafeb9c205a33e40b66bf84e268323034855efceb99312a09217: Status 404 returned error can't find the container with id 92a21b3c8e45dafeb9c205a33e40b66bf84e268323034855efceb99312a09217
Feb 23 08:47:50 crc kubenswrapper[5118]: I0223 08:47:50.308390 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:47:50 crc kubenswrapper[5118]: I0223 08:47:50.922950 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.070965 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm65p\" (UniqueName: \"kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p\") pod \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") "
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.071185 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data\") pod \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") "
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.071286 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle\") pod \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\" (UID: \"ca3a40be-b4c9-4e7e-97e3-fd23ae649889\") "
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.077999 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p" (OuterVolumeSpecName: "kube-api-access-nm65p") pod "ca3a40be-b4c9-4e7e-97e3-fd23ae649889" (UID: "ca3a40be-b4c9-4e7e-97e3-fd23ae649889"). InnerVolumeSpecName "kube-api-access-nm65p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.107598 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca3a40be-b4c9-4e7e-97e3-fd23ae649889" (UID: "ca3a40be-b4c9-4e7e-97e3-fd23ae649889"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.109525 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data" (OuterVolumeSpecName: "config-data") pod "ca3a40be-b4c9-4e7e-97e3-fd23ae649889" (UID: "ca3a40be-b4c9-4e7e-97e3-fd23ae649889"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.122603 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerStarted","Data":"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.122683 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerStarted","Data":"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.125495 5118 generic.go:334] "Generic (PLEG): container finished" podID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74" exitCode=0
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.125545 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3a40be-b4c9-4e7e-97e3-fd23ae649889","Type":"ContainerDied","Data":"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.125604 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ca3a40be-b4c9-4e7e-97e3-fd23ae649889","Type":"ContainerDied","Data":"ed15fb4346c938e79b1b02d487119954bc29e8fc9a93d27f5992a48810d44689"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.125610 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.125624 5118 scope.go:117] "RemoveContainer" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.128021 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerStarted","Data":"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.128082 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerStarted","Data":"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.128119 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerStarted","Data":"92a21b3c8e45dafeb9c205a33e40b66bf84e268323034855efceb99312a09217"}
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.157503 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.157474429 podStartE2EDuration="2.157474429s" podCreationTimestamp="2026-02-23 08:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:51.14296352 +0000 UTC m=+7334.146748113" watchObservedRunningTime="2026-02-23 08:47:51.157474429 +0000 UTC m=+7334.161259002"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.173263 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.173695 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm65p\" (UniqueName: \"kubernetes.io/projected/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-kube-api-access-nm65p\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.173712 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3a40be-b4c9-4e7e-97e3-fd23ae649889-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.198340 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.198314603 podStartE2EDuration="2.198314603s" podCreationTimestamp="2026-02-23 08:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:51.179201933 +0000 UTC m=+7334.182986506" watchObservedRunningTime="2026-02-23 08:47:51.198314603 +0000 UTC m=+7334.202099176"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.202092 5118 scope.go:117] "RemoveContainer" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"
Feb 23 08:47:51 crc kubenswrapper[5118]: E0223 08:47:51.205060 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74\": container with ID starting with 884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74 not found: ID does not exist" containerID="884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.205146 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74"} err="failed to get container status \"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74\": rpc error: code = NotFound desc = could not find container \"884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74\": container with ID starting with 884bcb4703960e4a6f83edc8e72dbdaa3846ca9f181c625b95fff3535adafb74 not found: ID does not exist"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.225238 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.236217 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.244136 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:51 crc kubenswrapper[5118]: E0223 08:47:51.244840 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" containerName="nova-scheduler-scheduler"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.244916 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" containerName="nova-scheduler-scheduler"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.245258 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" containerName="nova-scheduler-scheduler"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.246276 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.249003 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.267830 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.377381 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.377567 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.377807 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh24l\" (UniqueName: \"kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0"
Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.480249 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh24l\" (UniqueName: \"kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0"
Feb 
23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.480478 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.480529 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.485646 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.485772 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.502192 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh24l\" (UniqueName: \"kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l\") pod \"nova-scheduler-0\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.563298 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:47:51 crc kubenswrapper[5118]: I0223 08:47:51.715021 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3a40be-b4c9-4e7e-97e3-fd23ae649889" path="/var/lib/kubelet/pods/ca3a40be-b4c9-4e7e-97e3-fd23ae649889/volumes" Feb 23 08:47:52 crc kubenswrapper[5118]: I0223 08:47:52.159864 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:47:53 crc kubenswrapper[5118]: I0223 08:47:53.177700 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9","Type":"ContainerStarted","Data":"dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3"} Feb 23 08:47:53 crc kubenswrapper[5118]: I0223 08:47:53.178677 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9","Type":"ContainerStarted","Data":"23f03c27125a7cd1b91c7212d636c147d110df390e014a3f09f479cd8652e798"} Feb 23 08:47:53 crc kubenswrapper[5118]: I0223 08:47:53.210139 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.210063024 podStartE2EDuration="2.210063024s" podCreationTimestamp="2026-02-23 08:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:47:53.209955182 +0000 UTC m=+7336.213739795" watchObservedRunningTime="2026-02-23 08:47:53.210063024 +0000 UTC m=+7336.213847687" Feb 23 08:47:54 crc kubenswrapper[5118]: I0223 08:47:54.517464 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:47:54 crc kubenswrapper[5118]: I0223 08:47:54.517971 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:47:56 crc kubenswrapper[5118]: I0223 
08:47:56.564464 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:47:59 crc kubenswrapper[5118]: I0223 08:47:59.518242 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:47:59 crc kubenswrapper[5118]: I0223 08:47:59.518827 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:47:59 crc kubenswrapper[5118]: I0223 08:47:59.826279 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:47:59 crc kubenswrapper[5118]: I0223 08:47:59.826340 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:48:00 crc kubenswrapper[5118]: I0223 08:48:00.600329 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.86:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:48:00 crc kubenswrapper[5118]: I0223 08:48:00.600384 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.86:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:48:00 crc kubenswrapper[5118]: I0223 08:48:00.908494 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.87:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:48:00 crc kubenswrapper[5118]: I0223 08:48:00.908798 5118 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.87:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:48:01 crc kubenswrapper[5118]: I0223 08:48:01.563621 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 08:48:01 crc kubenswrapper[5118]: I0223 08:48:01.602991 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 08:48:02 crc kubenswrapper[5118]: I0223 08:48:02.345038 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.524881 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.525671 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.529787 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.531447 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.830983 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.833316 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.835290 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:48:09 crc kubenswrapper[5118]: I0223 08:48:09.836837 
5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.403028 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.410515 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.679719 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"] Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.682032 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.694012 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"] Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.858786 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.858856 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.858885 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.858951 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.861822 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ng5\" (UniqueName: \"kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.963909 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ng5\" (UniqueName: \"kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.963983 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.964009 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config\") pod 
\"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.964035 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.964061 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.965072 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.966156 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.966824 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " 
pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.967412 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:10 crc kubenswrapper[5118]: I0223 08:48:10.995451 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ng5\" (UniqueName: \"kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5\") pod \"dnsmasq-dns-7876bbdd9c-gsctp\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") " pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:11 crc kubenswrapper[5118]: I0223 08:48:11.032764 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:11 crc kubenswrapper[5118]: I0223 08:48:11.565320 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"] Feb 23 08:48:12 crc kubenswrapper[5118]: I0223 08:48:12.424691 5118 generic.go:334] "Generic (PLEG): container finished" podID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerID="3f8d91d8f93781bca7cd264703f679cfd62f9bd071ace2751ee7b348c2918382" exitCode=0 Feb 23 08:48:12 crc kubenswrapper[5118]: I0223 08:48:12.425479 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" event={"ID":"ca3daa1a-aef3-4a38-82d3-9b87638123ab","Type":"ContainerDied","Data":"3f8d91d8f93781bca7cd264703f679cfd62f9bd071ace2751ee7b348c2918382"} Feb 23 08:48:12 crc kubenswrapper[5118]: I0223 08:48:12.426386 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" 
event={"ID":"ca3daa1a-aef3-4a38-82d3-9b87638123ab","Type":"ContainerStarted","Data":"a313fe3424dc01fe62cf5d38c291b1d5994fd75831deca477bd5634a672322d6"} Feb 23 08:48:13 crc kubenswrapper[5118]: I0223 08:48:13.439075 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" event={"ID":"ca3daa1a-aef3-4a38-82d3-9b87638123ab","Type":"ContainerStarted","Data":"6369f880eea137dc4ab7ed1b3b57e5197888987cbc58828dd48692bda29ccf36"} Feb 23 08:48:13 crc kubenswrapper[5118]: I0223 08:48:13.439749 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:13 crc kubenswrapper[5118]: I0223 08:48:13.458316 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" podStartSLOduration=3.458296324 podStartE2EDuration="3.458296324s" podCreationTimestamp="2026-02-23 08:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:48:13.457068844 +0000 UTC m=+7356.460853407" watchObservedRunningTime="2026-02-23 08:48:13.458296324 +0000 UTC m=+7356.462080887" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.035318 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.123749 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"] Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.124437 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="dnsmasq-dns" containerID="cri-o://7023fae62968ec8b48f3a38acafb5befc3831f86e7b6441021e0e28d90aee46a" gracePeriod=10 Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.584905 5118 
generic.go:334] "Generic (PLEG): container finished" podID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerID="7023fae62968ec8b48f3a38acafb5befc3831f86e7b6441021e0e28d90aee46a" exitCode=0 Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.584983 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" event={"ID":"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95","Type":"ContainerDied","Data":"7023fae62968ec8b48f3a38acafb5befc3831f86e7b6441021e0e28d90aee46a"} Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.770963 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.825172 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb\") pod \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.825568 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmvtf\" (UniqueName: \"kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf\") pod \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.825625 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb\") pod \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.825654 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config\") pod \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.825889 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc\") pod \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\" (UID: \"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95\") " Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.836194 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf" (OuterVolumeSpecName: "kube-api-access-jmvtf") pod "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" (UID: "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95"). InnerVolumeSpecName "kube-api-access-jmvtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.900307 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" (UID: "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.902898 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config" (OuterVolumeSpecName: "config") pod "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" (UID: "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.921009 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" (UID: "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.927659 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmvtf\" (UniqueName: \"kubernetes.io/projected/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-kube-api-access-jmvtf\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.927698 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.927708 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.927720 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:21 crc kubenswrapper[5118]: I0223 08:48:21.942954 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" (UID: "4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.030009 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.600331 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" event={"ID":"4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95","Type":"ContainerDied","Data":"fda8834ff4398927fc7e6dc2bde87a24e7f1fb26d9ff7265fc010a68e3e5c934"} Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.600394 5118 scope.go:117] "RemoveContainer" containerID="7023fae62968ec8b48f3a38acafb5befc3831f86e7b6441021e0e28d90aee46a" Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.600485 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9dbb6f47-x95pd" Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.633193 5118 scope.go:117] "RemoveContainer" containerID="b888e9b874ca4b9ed8a78c6f1b2149b163a6ab83367e510c8688d0c12954e4a3" Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.666487 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"] Feb 23 08:48:22 crc kubenswrapper[5118]: I0223 08:48:22.681289 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d9dbb6f47-x95pd"] Feb 23 08:48:23 crc kubenswrapper[5118]: I0223 08:48:23.713750 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" path="/var/lib/kubelet/pods/4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95/volumes" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.350616 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sgvkz"] Feb 23 08:48:24 crc kubenswrapper[5118]: E0223 08:48:24.351113 5118 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="init" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.351128 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="init" Feb 23 08:48:24 crc kubenswrapper[5118]: E0223 08:48:24.351151 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="dnsmasq-dns" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.351157 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="dnsmasq-dns" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.351319 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c980ee1-3fe0-4e99-9a1d-14e5c08ceb95" containerName="dnsmasq-dns" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.352160 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.362842 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sgvkz"] Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.429287 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7106-account-create-update-8qzp6"] Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.430987 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.434370 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.449250 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7106-account-create-update-8qzp6"] Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.490483 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.490812 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.592766 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.592822 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " 
pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.592904 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.593003 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqq49\" (UniqueName: \"kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.593709 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.625553 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf\") pod \"cinder-db-create-sgvkz\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.680757 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.696277 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqq49\" (UniqueName: \"kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.696372 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.697163 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.713499 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqq49\" (UniqueName: \"kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49\") pod \"cinder-7106-account-create-update-8qzp6\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:24 crc kubenswrapper[5118]: I0223 08:48:24.754839 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.161230 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sgvkz"] Feb 23 08:48:25 crc kubenswrapper[5118]: W0223 08:48:25.162003 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe01c183_1593_499e_a295_d927371fae2f.slice/crio-81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706 WatchSource:0}: Error finding container 81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706: Status 404 returned error can't find the container with id 81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706 Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.378878 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7106-account-create-update-8qzp6"] Feb 23 08:48:25 crc kubenswrapper[5118]: W0223 08:48:25.385599 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f624a50_4192_4e8b_9759_9610d25bd843.slice/crio-cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4 WatchSource:0}: Error finding container cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4: Status 404 returned error can't find the container with id cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4 Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.629375 5118 generic.go:334] "Generic (PLEG): container finished" podID="fe01c183-1593-499e-a295-d927371fae2f" containerID="8551dd328058db9971618b44ca26868db695791ba6cab633179fb403d34348a7" exitCode=0 Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.629493 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sgvkz" 
event={"ID":"fe01c183-1593-499e-a295-d927371fae2f","Type":"ContainerDied","Data":"8551dd328058db9971618b44ca26868db695791ba6cab633179fb403d34348a7"} Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.630235 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sgvkz" event={"ID":"fe01c183-1593-499e-a295-d927371fae2f","Type":"ContainerStarted","Data":"81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706"} Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.632623 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7106-account-create-update-8qzp6" event={"ID":"2f624a50-4192-4e8b-9759-9610d25bd843","Type":"ContainerStarted","Data":"6954477e9bfb929d48ee7e253172c3d2bfbdcbff4194cc62381fbd21f0391c8a"} Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.632669 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7106-account-create-update-8qzp6" event={"ID":"2f624a50-4192-4e8b-9759-9610d25bd843","Type":"ContainerStarted","Data":"cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4"} Feb 23 08:48:25 crc kubenswrapper[5118]: I0223 08:48:25.672402 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7106-account-create-update-8qzp6" podStartSLOduration=1.672385478 podStartE2EDuration="1.672385478s" podCreationTimestamp="2026-02-23 08:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:48:25.669207672 +0000 UTC m=+7368.672992255" watchObservedRunningTime="2026-02-23 08:48:25.672385478 +0000 UTC m=+7368.676170051" Feb 23 08:48:26 crc kubenswrapper[5118]: I0223 08:48:26.646257 5118 generic.go:334] "Generic (PLEG): container finished" podID="2f624a50-4192-4e8b-9759-9610d25bd843" containerID="6954477e9bfb929d48ee7e253172c3d2bfbdcbff4194cc62381fbd21f0391c8a" exitCode=0 Feb 23 08:48:26 crc kubenswrapper[5118]: I0223 
08:48:26.646408 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7106-account-create-update-8qzp6" event={"ID":"2f624a50-4192-4e8b-9759-9610d25bd843","Type":"ContainerDied","Data":"6954477e9bfb929d48ee7e253172c3d2bfbdcbff4194cc62381fbd21f0391c8a"} Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.238967 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.349165 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf\") pod \"fe01c183-1593-499e-a295-d927371fae2f\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.349661 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts\") pod \"fe01c183-1593-499e-a295-d927371fae2f\" (UID: \"fe01c183-1593-499e-a295-d927371fae2f\") " Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.350868 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe01c183-1593-499e-a295-d927371fae2f" (UID: "fe01c183-1593-499e-a295-d927371fae2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.358552 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf" (OuterVolumeSpecName: "kube-api-access-hzplf") pod "fe01c183-1593-499e-a295-d927371fae2f" (UID: "fe01c183-1593-499e-a295-d927371fae2f"). 
InnerVolumeSpecName "kube-api-access-hzplf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.452301 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe01c183-1593-499e-a295-d927371fae2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.452342 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/fe01c183-1593-499e-a295-d927371fae2f-kube-api-access-hzplf\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.692892 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sgvkz" event={"ID":"fe01c183-1593-499e-a295-d927371fae2f","Type":"ContainerDied","Data":"81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706"} Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.692974 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81891563bfb9cf499771e8bd24c5ac8254721f68eea505f876d3e7851a8b7706" Feb 23 08:48:27 crc kubenswrapper[5118]: I0223 08:48:27.692931 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sgvkz" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.191329 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.290350 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqq49\" (UniqueName: \"kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49\") pod \"2f624a50-4192-4e8b-9759-9610d25bd843\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.290728 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts\") pod \"2f624a50-4192-4e8b-9759-9610d25bd843\" (UID: \"2f624a50-4192-4e8b-9759-9610d25bd843\") " Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.291801 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f624a50-4192-4e8b-9759-9610d25bd843" (UID: "2f624a50-4192-4e8b-9759-9610d25bd843"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.301863 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49" (OuterVolumeSpecName: "kube-api-access-sqq49") pod "2f624a50-4192-4e8b-9759-9610d25bd843" (UID: "2f624a50-4192-4e8b-9759-9610d25bd843"). InnerVolumeSpecName "kube-api-access-sqq49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.393142 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqq49\" (UniqueName: \"kubernetes.io/projected/2f624a50-4192-4e8b-9759-9610d25bd843-kube-api-access-sqq49\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.393191 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f624a50-4192-4e8b-9759-9610d25bd843-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.711415 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7106-account-create-update-8qzp6" event={"ID":"2f624a50-4192-4e8b-9759-9610d25bd843","Type":"ContainerDied","Data":"cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4"} Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.711485 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbbbb60e3e76102f4f38a68571a06498bff233c1db6c583eadf3fa4a7b6db2c4" Feb 23 08:48:28 crc kubenswrapper[5118]: I0223 08:48:28.711497 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7106-account-create-update-8qzp6" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.765226 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fszjc"] Feb 23 08:48:29 crc kubenswrapper[5118]: E0223 08:48:29.766141 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f624a50-4192-4e8b-9759-9610d25bd843" containerName="mariadb-account-create-update" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.766161 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f624a50-4192-4e8b-9759-9610d25bd843" containerName="mariadb-account-create-update" Feb 23 08:48:29 crc kubenswrapper[5118]: E0223 08:48:29.766188 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe01c183-1593-499e-a295-d927371fae2f" containerName="mariadb-database-create" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.766196 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe01c183-1593-499e-a295-d927371fae2f" containerName="mariadb-database-create" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.766455 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f624a50-4192-4e8b-9759-9610d25bd843" containerName="mariadb-account-create-update" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.766475 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe01c183-1593-499e-a295-d927371fae2f" containerName="mariadb-database-create" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.767386 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.770286 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.770491 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t827v" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.772688 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.777022 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fszjc"] Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927049 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927236 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927300 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927374 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927446 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:29 crc kubenswrapper[5118]: I0223 08:48:29.927473 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkq8h\" (UniqueName: \"kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029470 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029546 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029632 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029725 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029765 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkq8h\" (UniqueName: \"kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029822 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.029855 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.037757 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle\") pod \"cinder-db-sync-fszjc\" (UID: 
\"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.037815 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.042270 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.046176 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.054765 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkq8h\" (UniqueName: \"kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h\") pod \"cinder-db-sync-fszjc\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.084831 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fszjc" Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.609033 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fszjc"] Feb 23 08:48:30 crc kubenswrapper[5118]: I0223 08:48:30.734456 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fszjc" event={"ID":"5f526e0b-c3b6-42d7-ba48-0d476701b579","Type":"ContainerStarted","Data":"67a1c825e3206bde5ae8f96e7c3b02e16f717717b4caa4129a05d5c047c4a90e"} Feb 23 08:48:50 crc kubenswrapper[5118]: E0223 08:48:50.509836 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb" Feb 23 08:48:50 crc kubenswrapper[5118]: E0223 08:48:50.510845 5118 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb" Feb 23 08:48:50 crc kubenswrapper[5118]: E0223 08:48:50.511035 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zkq8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fszjc_openstack(5f526e0b-c3b6-42d7-ba48-0d476701b579): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 08:48:50 crc kubenswrapper[5118]: E0223 08:48:50.512457 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fszjc" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" Feb 23 08:48:50 crc kubenswrapper[5118]: E0223 08:48:50.948297 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/cinder-db-sync-fszjc" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" Feb 23 08:49:03 crc kubenswrapper[5118]: I0223 08:49:03.289760 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fszjc" event={"ID":"5f526e0b-c3b6-42d7-ba48-0d476701b579","Type":"ContainerStarted","Data":"4109af6013c738107be704ca631d39f3a74fbe065217b23d61a2be23c6929ffc"} Feb 23 08:49:03 crc kubenswrapper[5118]: I0223 08:49:03.317765 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fszjc" podStartSLOduration=2.796970764 podStartE2EDuration="34.317743411s" podCreationTimestamp="2026-02-23 08:48:29 +0000 UTC" firstStartedPulling="2026-02-23 08:48:30.615794706 +0000 UTC m=+7373.619579279" lastFinishedPulling="2026-02-23 08:49:02.136567353 +0000 UTC m=+7405.140351926" observedRunningTime="2026-02-23 08:49:03.307906214 +0000 UTC 
m=+7406.311690787" watchObservedRunningTime="2026-02-23 08:49:03.317743411 +0000 UTC m=+7406.321527994" Feb 23 08:49:06 crc kubenswrapper[5118]: I0223 08:49:06.327124 5118 generic.go:334] "Generic (PLEG): container finished" podID="5f526e0b-c3b6-42d7-ba48-0d476701b579" containerID="4109af6013c738107be704ca631d39f3a74fbe065217b23d61a2be23c6929ffc" exitCode=0 Feb 23 08:49:06 crc kubenswrapper[5118]: I0223 08:49:06.327153 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fszjc" event={"ID":"5f526e0b-c3b6-42d7-ba48-0d476701b579","Type":"ContainerDied","Data":"4109af6013c738107be704ca631d39f3a74fbe065217b23d61a2be23c6929ffc"} Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.787082 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fszjc" Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.841927 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkq8h\" (UniqueName: \"kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.842084 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.842238 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.842288 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.842437 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.842511 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts\") pod \"5f526e0b-c3b6-42d7-ba48-0d476701b579\" (UID: \"5f526e0b-c3b6-42d7-ba48-0d476701b579\") " Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.843043 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.843302 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5f526e0b-c3b6-42d7-ba48-0d476701b579-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.853740 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts" (OuterVolumeSpecName: "scripts") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.854905 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h" (OuterVolumeSpecName: "kube-api-access-zkq8h") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "kube-api-access-zkq8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.855209 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.884057 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.905412 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data" (OuterVolumeSpecName: "config-data") pod "5f526e0b-c3b6-42d7-ba48-0d476701b579" (UID: "5f526e0b-c3b6-42d7-ba48-0d476701b579"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.945263 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.945319 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkq8h\" (UniqueName: \"kubernetes.io/projected/5f526e0b-c3b6-42d7-ba48-0d476701b579-kube-api-access-zkq8h\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.945333 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.945345 5118 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:07 crc kubenswrapper[5118]: I0223 08:49:07.945356 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f526e0b-c3b6-42d7-ba48-0d476701b579-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.353513 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fszjc" event={"ID":"5f526e0b-c3b6-42d7-ba48-0d476701b579","Type":"ContainerDied","Data":"67a1c825e3206bde5ae8f96e7c3b02e16f717717b4caa4129a05d5c047c4a90e"}
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.353905 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67a1c825e3206bde5ae8f96e7c3b02e16f717717b4caa4129a05d5c047c4a90e"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.353794 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fszjc"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.696695 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"]
Feb 23 08:49:08 crc kubenswrapper[5118]: E0223 08:49:08.697529 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" containerName="cinder-db-sync"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.697555 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" containerName="cinder-db-sync"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.697839 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" containerName="cinder-db-sync"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.699268 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.761049 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"]
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.865523 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.865580 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.865603 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.865693 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qlmh\" (UniqueName: \"kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.865742 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.925260 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.927542 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.934749 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.934937 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.935148 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.935293 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t827v"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.944243 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.968954 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qlmh\" (UniqueName: \"kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.969428 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.969909 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.969953 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.970030 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.971449 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.972047 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.972587 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:08 crc kubenswrapper[5118]: I0223 08:49:08.973146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.000141 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qlmh\" (UniqueName: \"kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh\") pod \"dnsmasq-dns-7b6d64ffc9-rwxpm\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.062930 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072087 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072162 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgzm\" (UniqueName: \"kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072211 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072264 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.072716 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.073008 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181116 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181245 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181360 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181400 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgzm\" (UniqueName: \"kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181485 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181518 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181552 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.181654 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.184972 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.205379 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.208631 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.209482 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.239529 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.263797 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgzm\" (UniqueName: \"kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm\") pod \"cinder-api-0\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.549066 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 23 08:49:09 crc kubenswrapper[5118]: I0223 08:49:09.778663 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"]
Feb 23 08:49:10 crc kubenswrapper[5118]: I0223 08:49:10.096661 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 23 08:49:10 crc kubenswrapper[5118]: W0223 08:49:10.122066 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f06da1_9540_4385_9815_17ba1edd64d2.slice/crio-f6446efc2778963504bc60721f289032571fadfeb86f57c9048ba551c2a99876 WatchSource:0}: Error finding container f6446efc2778963504bc60721f289032571fadfeb86f57c9048ba551c2a99876: Status 404 returned error can't find the container with id f6446efc2778963504bc60721f289032571fadfeb86f57c9048ba551c2a99876
Feb 23 08:49:10 crc kubenswrapper[5118]: I0223 08:49:10.381311 5118 generic.go:334] "Generic (PLEG): container finished" podID="214a2d91-6efd-4585-951e-00ab57da645b" containerID="b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e" exitCode=0
Feb 23 08:49:10 crc kubenswrapper[5118]: I0223 08:49:10.381843 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" event={"ID":"214a2d91-6efd-4585-951e-00ab57da645b","Type":"ContainerDied","Data":"b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e"}
Feb 23 08:49:10 crc kubenswrapper[5118]: I0223 08:49:10.381879 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" event={"ID":"214a2d91-6efd-4585-951e-00ab57da645b","Type":"ContainerStarted","Data":"cc8a8ed5f04d1d0443062700e1bdfd6730efe5cef7e675c5bc6cbbab255cd06e"}
Feb 23 08:49:10 crc kubenswrapper[5118]: I0223 08:49:10.388768 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerStarted","Data":"f6446efc2778963504bc60721f289032571fadfeb86f57c9048ba551c2a99876"}
Feb 23 08:49:11 crc kubenswrapper[5118]: I0223 08:49:11.409484 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" event={"ID":"214a2d91-6efd-4585-951e-00ab57da645b","Type":"ContainerStarted","Data":"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b"}
Feb 23 08:49:11 crc kubenswrapper[5118]: I0223 08:49:11.411469 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:11 crc kubenswrapper[5118]: I0223 08:49:11.423137 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerStarted","Data":"fe042ef0d555dd5feaf2513d36e29a1302dc0d12f7548bc911336fb4754bbff8"}
Feb 23 08:49:11 crc kubenswrapper[5118]: I0223 08:49:11.447670 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" podStartSLOduration=3.447647441 podStartE2EDuration="3.447647441s" podCreationTimestamp="2026-02-23 08:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:11.435575201 +0000 UTC m=+7414.439359784" watchObservedRunningTime="2026-02-23 08:49:11.447647441 +0000 UTC m=+7414.451432024"
Feb 23 08:49:12 crc kubenswrapper[5118]: I0223 08:49:12.441181 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerStarted","Data":"f2494bf1123c09af1557b911f3583242dfd171e5e0c831ff907da92cae520a68"}
Feb 23 08:49:12 crc kubenswrapper[5118]: I0223 08:49:12.441708 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 23 08:49:12 crc kubenswrapper[5118]: I0223 08:49:12.483036 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.482989477 podStartE2EDuration="4.482989477s" podCreationTimestamp="2026-02-23 08:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:12.459133253 +0000 UTC m=+7415.462917876" watchObservedRunningTime="2026-02-23 08:49:12.482989477 +0000 UTC m=+7415.486774070"
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.064312 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.144644 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"]
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.144940 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerName="dnsmasq-dns" containerID="cri-o://6369f880eea137dc4ab7ed1b3b57e5197888987cbc58828dd48692bda29ccf36" gracePeriod=10
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.540397 5118 generic.go:334] "Generic (PLEG): container finished" podID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerID="6369f880eea137dc4ab7ed1b3b57e5197888987cbc58828dd48692bda29ccf36" exitCode=0
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.540764 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" event={"ID":"ca3daa1a-aef3-4a38-82d3-9b87638123ab","Type":"ContainerDied","Data":"6369f880eea137dc4ab7ed1b3b57e5197888987cbc58828dd48692bda29ccf36"}
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.723718 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp"
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.777637 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb\") pod \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") "
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.777894 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb\") pod \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") "
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.846640 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca3daa1a-aef3-4a38-82d3-9b87638123ab" (UID: "ca3daa1a-aef3-4a38-82d3-9b87638123ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.856625 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca3daa1a-aef3-4a38-82d3-9b87638123ab" (UID: "ca3daa1a-aef3-4a38-82d3-9b87638123ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.879226 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config\") pod \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") "
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.879652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc\") pod \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") "
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.879742 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4ng5\" (UniqueName: \"kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5\") pod \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\" (UID: \"ca3daa1a-aef3-4a38-82d3-9b87638123ab\") "
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.880078 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.880105 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.883786 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5" (OuterVolumeSpecName: "kube-api-access-k4ng5") pod "ca3daa1a-aef3-4a38-82d3-9b87638123ab" (UID: "ca3daa1a-aef3-4a38-82d3-9b87638123ab"). InnerVolumeSpecName "kube-api-access-k4ng5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.924661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config" (OuterVolumeSpecName: "config") pod "ca3daa1a-aef3-4a38-82d3-9b87638123ab" (UID: "ca3daa1a-aef3-4a38-82d3-9b87638123ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.925363 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca3daa1a-aef3-4a38-82d3-9b87638123ab" (UID: "ca3daa1a-aef3-4a38-82d3-9b87638123ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.981511 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.981557 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca3daa1a-aef3-4a38-82d3-9b87638123ab-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:19 crc kubenswrapper[5118]: I0223 08:49:19.981574 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4ng5\" (UniqueName: \"kubernetes.io/projected/ca3daa1a-aef3-4a38-82d3-9b87638123ab-kube-api-access-k4ng5\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.552842 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp" event={"ID":"ca3daa1a-aef3-4a38-82d3-9b87638123ab","Type":"ContainerDied","Data":"a313fe3424dc01fe62cf5d38c291b1d5994fd75831deca477bd5634a672322d6"}
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.552905 5118 scope.go:117] "RemoveContainer" containerID="6369f880eea137dc4ab7ed1b3b57e5197888987cbc58828dd48692bda29ccf36"
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.553073 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7876bbdd9c-gsctp"
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.601915 5118 scope.go:117] "RemoveContainer" containerID="3f8d91d8f93781bca7cd264703f679cfd62f9bd071ace2751ee7b348c2918382"
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.616177 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.628649 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7876bbdd9c-gsctp"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.760856 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.761482 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-log" containerID="cri-o://20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.761567 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-metadata" containerID="cri-o://195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.777209 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.777474 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://78d34c23cc6382d352e86b30a107dc2a84567ddc2ec844444c8d11077d4669bc" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.793921 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.794248 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler" containerID="cri-o://dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.808404 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.808737 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-log" containerID="cri-o://bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.809203 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-api" containerID="cri-o://7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.822225 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.822447 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.868175 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 08:49:20 crc kubenswrapper[5118]: I0223 08:49:20.868393 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerName="nova-cell1-conductor-conductor" containerID="cri-o://25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1" gracePeriod=30
Feb 23 08:49:20 crc kubenswrapper[5118]: E0223 08:49:20.878632 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:20 crc kubenswrapper[5118]: E0223 08:49:20.881258 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:20 crc kubenswrapper[5118]: E0223 08:49:20.883733 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:20 crc kubenswrapper[5118]: E0223 08:49:20.883807 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerName="nova-cell0-conductor-conductor"
Feb 23 08:49:21 crc kubenswrapper[5118]: E0223 08:49:21.565681 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:49:21 crc kubenswrapper[5118]: E0223 08:49:21.567156 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:49:21 crc kubenswrapper[5118]: E0223 08:49:21.568610 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 23 08:49:21 crc kubenswrapper[5118]: E0223 08:49:21.568654 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler"
Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.574603 5118 generic.go:334] "Generic (PLEG): container finished" podID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" containerID="78d34c23cc6382d352e86b30a107dc2a84567ddc2ec844444c8d11077d4669bc" exitCode=0 Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.574679 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52","Type":"ContainerDied","Data":"78d34c23cc6382d352e86b30a107dc2a84567ddc2ec844444c8d11077d4669bc"} Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.580328 5118 generic.go:334] "Generic (PLEG): container finished" podID="bccbce9a-839b-4566-a047-9cd168e610b4" containerID="bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b" exitCode=143 Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.580481 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerDied","Data":"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"} Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.584278 5118 generic.go:334] "Generic (PLEG): container finished" podID="d576fe42-601f-4592-9c3f-61d507293af9" containerID="20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371" exitCode=143 Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.584327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerDied","Data":"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"} Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.663761 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.726804 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" path="/var/lib/kubelet/pods/ca3daa1a-aef3-4a38-82d3-9b87638123ab/volumes" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.785760 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.816967 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle\") pod \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.817037 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxf6r\" (UniqueName: \"kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r\") pod \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.817132 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data\") pod \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\" (UID: \"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52\") " Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.857403 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r" (OuterVolumeSpecName: "kube-api-access-kxf6r") pod "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" (UID: "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52"). InnerVolumeSpecName "kube-api-access-kxf6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.858946 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data" (OuterVolumeSpecName: "config-data") pod "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" (UID: "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.862339 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" (UID: "57f97ea1-96a6-4bb5-8df9-5f14c5c96e52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.919738 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.919766 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxf6r\" (UniqueName: \"kubernetes.io/projected/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-kube-api-access-kxf6r\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:21 crc kubenswrapper[5118]: I0223 08:49:21.919782 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.594241 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"57f97ea1-96a6-4bb5-8df9-5f14c5c96e52","Type":"ContainerDied","Data":"2fe2cf6c4ced7ae91309f4600867caa542e6430892eb040e12462812e82156d9"} Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.594292 5118 scope.go:117] "RemoveContainer" containerID="78d34c23cc6382d352e86b30a107dc2a84567ddc2ec844444c8d11077d4669bc" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.594312 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.641237 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.652546 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.665839 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:49:22 crc kubenswrapper[5118]: E0223 08:49:22.666948 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.666974 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:49:22 crc kubenswrapper[5118]: E0223 08:49:22.667013 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerName="dnsmasq-dns" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.667021 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerName="dnsmasq-dns" Feb 23 08:49:22 crc kubenswrapper[5118]: E0223 08:49:22.667034 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" 
containerName="init" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.667041 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerName="init" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.667293 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" containerName="nova-cell1-novncproxy-novncproxy" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.667321 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3daa1a-aef3-4a38-82d3-9b87638123ab" containerName="dnsmasq-dns" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.668124 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.671247 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.684164 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.834228 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.834396 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gwx\" (UniqueName: \"kubernetes.io/projected/7e6f7cef-764b-4c09-920e-38f952ce4538-kube-api-access-m2gwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 
08:49:22.834467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.935768 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.935891 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gwx\" (UniqueName: \"kubernetes.io/projected/7e6f7cef-764b-4c09-920e-38f952ce4538-kube-api-access-m2gwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.935964 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.943056 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.951984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m2gwx\" (UniqueName: \"kubernetes.io/projected/7e6f7cef-764b-4c09-920e-38f952ce4538-kube-api-access-m2gwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.952264 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6f7cef-764b-4c09-920e-38f952ce4538-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7e6f7cef-764b-4c09-920e-38f952ce4538\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:22 crc kubenswrapper[5118]: I0223 08:49:22.986379 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:23 crc kubenswrapper[5118]: I0223 08:49:23.458897 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 08:49:23 crc kubenswrapper[5118]: W0223 08:49:23.461454 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e6f7cef_764b_4c09_920e_38f952ce4538.slice/crio-ea4aeaf3261a664c57948f80000bc52ed533f75a8c0773c455fc2df548732408 WatchSource:0}: Error finding container ea4aeaf3261a664c57948f80000bc52ed533f75a8c0773c455fc2df548732408: Status 404 returned error can't find the container with id ea4aeaf3261a664c57948f80000bc52ed533f75a8c0773c455fc2df548732408 Feb 23 08:49:23 crc kubenswrapper[5118]: I0223 08:49:23.609846 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e6f7cef-764b-4c09-920e-38f952ce4538","Type":"ContainerStarted","Data":"ea4aeaf3261a664c57948f80000bc52ed533f75a8c0773c455fc2df548732408"} Feb 23 08:49:23 crc kubenswrapper[5118]: I0223 08:49:23.717419 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57f97ea1-96a6-4bb5-8df9-5f14c5c96e52" 
path="/var/lib/kubelet/pods/57f97ea1-96a6-4bb5-8df9-5f14c5c96e52/volumes" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.408906 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.542307 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.582125 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv9xc\" (UniqueName: \"kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc\") pod \"d576fe42-601f-4592-9c3f-61d507293af9\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.589751 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc" (OuterVolumeSpecName: "kube-api-access-fv9xc") pod "d576fe42-601f-4592-9c3f-61d507293af9" (UID: "d576fe42-601f-4592-9c3f-61d507293af9"). InnerVolumeSpecName "kube-api-access-fv9xc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.589900 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data\") pod \"d576fe42-601f-4592-9c3f-61d507293af9\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.589968 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle\") pod \"d576fe42-601f-4592-9c3f-61d507293af9\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.590222 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs\") pod \"d576fe42-601f-4592-9c3f-61d507293af9\" (UID: \"d576fe42-601f-4592-9c3f-61d507293af9\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.590969 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv9xc\" (UniqueName: \"kubernetes.io/projected/d576fe42-601f-4592-9c3f-61d507293af9-kube-api-access-fv9xc\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.591294 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs" (OuterVolumeSpecName: "logs") pod "d576fe42-601f-4592-9c3f-61d507293af9" (UID: "d576fe42-601f-4592-9c3f-61d507293af9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.652188 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data" (OuterVolumeSpecName: "config-data") pod "d576fe42-601f-4592-9c3f-61d507293af9" (UID: "d576fe42-601f-4592-9c3f-61d507293af9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.691713 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7e6f7cef-764b-4c09-920e-38f952ce4538","Type":"ContainerStarted","Data":"1802a51275c5acb702e6b47f0cce3aedfd635ce33ac445690db7020d498671ba"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.693273 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs\") pod \"bccbce9a-839b-4566-a047-9cd168e610b4\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.693390 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data\") pod \"bccbce9a-839b-4566-a047-9cd168e610b4\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.693520 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9zm5\" (UniqueName: \"kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5\") pod \"bccbce9a-839b-4566-a047-9cd168e610b4\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.693664 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle\") pod \"bccbce9a-839b-4566-a047-9cd168e610b4\" (UID: \"bccbce9a-839b-4566-a047-9cd168e610b4\") " Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.694059 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.694072 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d576fe42-601f-4592-9c3f-61d507293af9-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.701904 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs" (OuterVolumeSpecName: "logs") pod "bccbce9a-839b-4566-a047-9cd168e610b4" (UID: "bccbce9a-839b-4566-a047-9cd168e610b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.705607 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5" (OuterVolumeSpecName: "kube-api-access-v9zm5") pod "bccbce9a-839b-4566-a047-9cd168e610b4" (UID: "bccbce9a-839b-4566-a047-9cd168e610b4"). InnerVolumeSpecName "kube-api-access-v9zm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.710556 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d576fe42-601f-4592-9c3f-61d507293af9" (UID: "d576fe42-601f-4592-9c3f-61d507293af9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.739493 5118 generic.go:334] "Generic (PLEG): container finished" podID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerID="25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1" exitCode=0 Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.740075 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285709b4-ebda-4e1f-91c3-905c79ab31af","Type":"ContainerDied","Data":"25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.742654 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.742627839 podStartE2EDuration="2.742627839s" podCreationTimestamp="2026-02-23 08:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:24.738796906 +0000 UTC m=+7427.742581479" watchObservedRunningTime="2026-02-23 08:49:24.742627839 +0000 UTC m=+7427.746412402" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.750055 5118 generic.go:334] "Generic (PLEG): container finished" podID="bccbce9a-839b-4566-a047-9cd168e610b4" containerID="7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd" exitCode=0 Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.750386 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerDied","Data":"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.750691 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"bccbce9a-839b-4566-a047-9cd168e610b4","Type":"ContainerDied","Data":"92a21b3c8e45dafeb9c205a33e40b66bf84e268323034855efceb99312a09217"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.750847 5118 scope.go:117] "RemoveContainer" containerID="7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.751237 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.763393 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data" (OuterVolumeSpecName: "config-data") pod "bccbce9a-839b-4566-a047-9cd168e610b4" (UID: "bccbce9a-839b-4566-a047-9cd168e610b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.771607 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bccbce9a-839b-4566-a047-9cd168e610b4" (UID: "bccbce9a-839b-4566-a047-9cd168e610b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.784066 5118 generic.go:334] "Generic (PLEG): container finished" podID="d576fe42-601f-4592-9c3f-61d507293af9" containerID="195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513" exitCode=0 Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.784139 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerDied","Data":"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.784175 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d576fe42-601f-4592-9c3f-61d507293af9","Type":"ContainerDied","Data":"b08d566111abf84b500aba9e95630b8d86befd8045b99138a5a3216976ef3186"} Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.784242 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.796894 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccbce9a-839b-4566-a047-9cd168e610b4-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.797165 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.797235 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9zm5\" (UniqueName: \"kubernetes.io/projected/bccbce9a-839b-4566-a047-9cd168e610b4-kube-api-access-v9zm5\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.797324 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d576fe42-601f-4592-9c3f-61d507293af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:24 crc kubenswrapper[5118]: I0223 08:49:24.797382 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccbce9a-839b-4566-a047-9cd168e610b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.013370 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.026493 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.033744 5118 scope.go:117] "RemoveContainer" containerID="bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.046188 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.046846 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-api"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.046863 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-api"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.046873 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.046879 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.046895 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-metadata"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.046901 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-metadata"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.046912 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.046921 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.047136 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-metadata"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.047164 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.047180 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d576fe42-601f-4592-9c3f-61d507293af9" containerName="nova-metadata-log"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.047194 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" containerName="nova-api-api"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.048396 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.051994 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.073261 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.075389 5118 scope.go:117] "RemoveContainer" containerID="7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.076537 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd\": container with ID starting with 7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd not found: ID does not exist" containerID="7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.076588 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd"} err="failed to get container status \"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd\": rpc error: code = NotFound desc = could not find container \"7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd\": container with ID starting with 7cbaa7b2bd35b2e9744a83c332b360768e05af602adae75c06325f45ee28b7cd not found: ID does not exist"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.076625 5118 scope.go:117] "RemoveContainer" containerID="bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.077115 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b\": container with ID starting with bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b not found: ID does not exist" containerID="bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.077139 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b"} err="failed to get container status \"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b\": rpc error: code = NotFound desc = could not find container \"bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b\": container with ID starting with bf0e29d936a4b69bce3ed3b6ae4c5904a11cdec976cf3b5b6841ffd9de77657b not found: ID does not exist"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.077153 5118 scope.go:117] "RemoveContainer" containerID="195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.082200 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1 is running failed: container process not found" containerID="25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.083636 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1 is running failed: container process not found" containerID="25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.089798 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1 is running failed: container process not found" containerID="25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.089897 5118 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerName="nova-cell1-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.108865 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.114608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.114722 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.114802 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.114910 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5nk\" (UniqueName: \"kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.132591 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.134132 5118 scope.go:117] "RemoveContainer" containerID="20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.140921 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.143595 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.147383 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.153251 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.179440 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.188277 5118 scope.go:117] "RemoveContainer" containerID="195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.196341 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513\": container with ID starting with 195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513 not found: ID does not exist" containerID="195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.196427 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513"} err="failed to get container status \"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513\": rpc error: code = NotFound desc = could not find container \"195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513\": container with ID starting with 195c83f7bec95af87cb791b0d6809feffca90a0975146159a59211bd908b4513 not found: ID does not exist"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.196464 5118 scope.go:117] "RemoveContainer" containerID="20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.216837 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.216939 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5nk\" (UniqueName: \"kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.217017 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.217074 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.219468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.221856 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.230274 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.233667 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371\": container with ID starting with 20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371 not found: ID does not exist" containerID="20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.233742 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371"} err="failed to get container status \"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371\": rpc error: code = NotFound desc = could not find container \"20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371\": container with ID starting with 20192000f0b5de0505d84767d96d420f5fdc1e68365ad2be4324b913cd1b4371 not found: ID does not exist"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.236372 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5nk\" (UniqueName: \"kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk\") pod \"nova-metadata-0\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.316270 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.317913 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data\") pod \"285709b4-ebda-4e1f-91c3-905c79ab31af\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.318056 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle\") pod \"285709b4-ebda-4e1f-91c3-905c79ab31af\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.318110 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbqrg\" (UniqueName: \"kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg\") pod \"285709b4-ebda-4e1f-91c3-905c79ab31af\" (UID: \"285709b4-ebda-4e1f-91c3-905c79ab31af\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.323788 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.323868 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg" (OuterVolumeSpecName: "kube-api-access-dbqrg") pod "285709b4-ebda-4e1f-91c3-905c79ab31af" (UID: "285709b4-ebda-4e1f-91c3-905c79ab31af"). InnerVolumeSpecName "kube-api-access-dbqrg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.324345 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.324438 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktrf\" (UniqueName: \"kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.324558 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.325010 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbqrg\" (UniqueName: \"kubernetes.io/projected/285709b4-ebda-4e1f-91c3-905c79ab31af-kube-api-access-dbqrg\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.370183 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data" (OuterVolumeSpecName: "config-data") pod "285709b4-ebda-4e1f-91c3-905c79ab31af" (UID: "285709b4-ebda-4e1f-91c3-905c79ab31af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.378952 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "285709b4-ebda-4e1f-91c3-905c79ab31af" (UID: "285709b4-ebda-4e1f-91c3-905c79ab31af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.384621 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.427935 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle\") pod \"00136a25-626f-4abc-9f43-776ce7aa84bd\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428131 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data\") pod \"00136a25-626f-4abc-9f43-776ce7aa84bd\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428368 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8xdf\" (UniqueName: \"kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf\") pod \"00136a25-626f-4abc-9f43-776ce7aa84bd\" (UID: \"00136a25-626f-4abc-9f43-776ce7aa84bd\") "
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428769 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428923 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428942 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktrf\" (UniqueName: \"kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.428978 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.429127 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.429154 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/285709b4-ebda-4e1f-91c3-905c79ab31af-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.430540 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.435188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.438931 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.443354 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf" (OuterVolumeSpecName: "kube-api-access-d8xdf") pod "00136a25-626f-4abc-9f43-776ce7aa84bd" (UID: "00136a25-626f-4abc-9f43-776ce7aa84bd"). InnerVolumeSpecName "kube-api-access-d8xdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.459256 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktrf\" (UniqueName: \"kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf\") pod \"nova-api-0\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.472181 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.475240 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data" (OuterVolumeSpecName: "config-data") pod "00136a25-626f-4abc-9f43-776ce7aa84bd" (UID: "00136a25-626f-4abc-9f43-776ce7aa84bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.478555 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00136a25-626f-4abc-9f43-776ce7aa84bd" (UID: "00136a25-626f-4abc-9f43-776ce7aa84bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.534404 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8xdf\" (UniqueName: \"kubernetes.io/projected/00136a25-626f-4abc-9f43-776ce7aa84bd-kube-api-access-d8xdf\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.534796 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.534805 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00136a25-626f-4abc-9f43-776ce7aa84bd-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.716477 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccbce9a-839b-4566-a047-9cd168e610b4" path="/var/lib/kubelet/pods/bccbce9a-839b-4566-a047-9cd168e610b4/volumes"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.717313 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d576fe42-601f-4592-9c3f-61d507293af9" path="/var/lib/kubelet/pods/d576fe42-601f-4592-9c3f-61d507293af9/volumes"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.801248 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"285709b4-ebda-4e1f-91c3-905c79ab31af","Type":"ContainerDied","Data":"942ea0f7651acc5fa8de96ba6853c4e1244fae50d98d4d9d2f52e29b0a6fa599"}
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.801342 5118 scope.go:117] "RemoveContainer" containerID="25170ba5465f5168ad3aeb01416205cd8e389fb5c7a12e85fdf21568f1041ab1"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.801414 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.829737 5118 generic.go:334] "Generic (PLEG): container finished" podID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286" exitCode=0
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.830117 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00136a25-626f-4abc-9f43-776ce7aa84bd","Type":"ContainerDied","Data":"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"}
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.830281 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00136a25-626f-4abc-9f43-776ce7aa84bd","Type":"ContainerDied","Data":"859e7b7a673f0cd6d9df80fa1db16e5a1b4651b64dfe9e5705cef62ada003511"}
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.830147 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.894268 5118 scope.go:117] "RemoveContainer" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.900564 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.922598 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.934598 5118 scope.go:117] "RemoveContainer" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.934747 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.935238 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerName="nova-cell0-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.935262 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerName="nova-cell0-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.935302 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerName="nova-cell1-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.935310 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerName="nova-cell1-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.935482 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" containerName="nova-cell0-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.935505 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" containerName="nova-cell1-conductor-conductor"
Feb 23 08:49:25 crc kubenswrapper[5118]: E0223 08:49:25.938001 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286\": container with ID starting with b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286 not found: ID does not exist" containerID="b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.938032 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.938068 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286"} err="failed to get container status \"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286\": rpc error: code = NotFound desc = could not find container \"b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286\": container with ID starting with b4bc0b095c9666a92cd0063745620af51e15a2d83d657ab265f3a9b4acfec286 not found: ID does not exist"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.942008 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.953797 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.966384 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.979749 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.988839 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.991370 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:25 crc kubenswrapper[5118]: I0223 08:49:25.994606 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.017628 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.030616 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 08:49:26 crc kubenswrapper[5118]: W0223 08:49:26.030853 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5203da4c_1034_4cfa_9671_18b8de107d14.slice/crio-8909dce1f456926b718dbf7ab8c780846a9ea8cf2e84096c7f89a5754501a243 WatchSource:0}: Error finding container 8909dce1f456926b718dbf7ab8c780846a9ea8cf2e84096c7f89a5754501a243: Status 404 returned error can't find the container with id 8909dce1f456926b718dbf7ab8c780846a9ea8cf2e84096c7f89a5754501a243
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.048839 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5nv\" (UniqueName: \"kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.048894 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.048924 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.053972 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.150768 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9m5h\" (UniqueName: \"kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.150926 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5nv\" (UniqueName: \"kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.150971 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.151000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.151036 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.151071 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.160655 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.169717 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5nv\" (UniqueName: \"kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0"
Feb 23 08:49:26 crc kubenswrapper[5118]: I0223
08:49:26.169889 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " pod="openstack/nova-cell1-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.252855 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9m5h\" (UniqueName: \"kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.252998 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.253049 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.261132 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.261315 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.272065 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9m5h\" (UniqueName: \"kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h\") pod \"nova-cell0-conductor-0\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.377765 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.406604 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:26 crc kubenswrapper[5118]: E0223 08:49:26.567855 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:49:26 crc kubenswrapper[5118]: E0223 08:49:26.570079 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:49:26 crc kubenswrapper[5118]: E0223 08:49:26.573545 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 08:49:26 crc kubenswrapper[5118]: E0223 08:49:26.573611 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.848031 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerStarted","Data":"c56fbd16f80d712125d16d5cc5cf324d93722944ee8505318a95e6cc986a0499"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.848076 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerStarted","Data":"404cb13a74f8a0d840e7ef560b45a9bba5286fdbba1c44359a1dc0c3d3698763"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.848086 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerStarted","Data":"cc29465f9ae403d3f12a55c357d0f476f2ff0d1dbfdcd07d3053d4fda66479ec"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.852830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerStarted","Data":"49bcaced6476503e7831711bb37e6b1e48108345338319e243caad11e097fa80"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.852896 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerStarted","Data":"7cecc2661030d87870d8645913fc54230ed2c8e4402656e658611a330679ec48"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.852919 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerStarted","Data":"8909dce1f456926b718dbf7ab8c780846a9ea8cf2e84096c7f89a5754501a243"} Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.887815 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.887789643 podStartE2EDuration="1.887789643s" podCreationTimestamp="2026-02-23 08:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:26.877768612 +0000 UTC m=+7429.881553215" watchObservedRunningTime="2026-02-23 08:49:26.887789643 +0000 UTC m=+7429.891574216" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.907281 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.907258032 podStartE2EDuration="1.907258032s" podCreationTimestamp="2026-02-23 08:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:26.898318716 +0000 UTC m=+7429.902103309" watchObservedRunningTime="2026-02-23 08:49:26.907258032 +0000 UTC m=+7429.911042605" Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.946342 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 08:49:26 crc kubenswrapper[5118]: I0223 08:49:26.997814 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.714021 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="00136a25-626f-4abc-9f43-776ce7aa84bd" path="/var/lib/kubelet/pods/00136a25-626f-4abc-9f43-776ce7aa84bd/volumes" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.715243 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285709b4-ebda-4e1f-91c3-905c79ab31af" path="/var/lib/kubelet/pods/285709b4-ebda-4e1f-91c3-905c79ab31af/volumes" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.874736 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5314407a-1b6c-4ad5-89aa-c876d1aaa379","Type":"ContainerStarted","Data":"8e3a321083b8fe17b275eb3b0b34b396e8af35cd8e7b81623428981c66d976f7"} Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.874890 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5314407a-1b6c-4ad5-89aa-c876d1aaa379","Type":"ContainerStarted","Data":"bcac2794880d02397dd8d438704ee9a87df0de3a68d5262c3477eeca56ed994c"} Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.877076 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.883681 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fe565b17-17ee-4719-969b-0e4ee9cda9b5","Type":"ContainerStarted","Data":"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b"} Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.883729 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fe565b17-17ee-4719-969b-0e4ee9cda9b5","Type":"ContainerStarted","Data":"8140d2d30c43be9152c73623a0b0a737ce28c27bfe3450591ff90fdefdd7ee55"} Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.886040 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 
08:49:27.900079 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.900057842 podStartE2EDuration="2.900057842s" podCreationTimestamp="2026-02-23 08:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:27.896415294 +0000 UTC m=+7430.900199867" watchObservedRunningTime="2026-02-23 08:49:27.900057842 +0000 UTC m=+7430.903842425" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.919639 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.919622284 podStartE2EDuration="2.919622284s" podCreationTimestamp="2026-02-23 08:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:27.913242709 +0000 UTC m=+7430.917027282" watchObservedRunningTime="2026-02-23 08:49:27.919622284 +0000 UTC m=+7430.923406857" Feb 23 08:49:27 crc kubenswrapper[5118]: I0223 08:49:27.989316 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:30 crc kubenswrapper[5118]: I0223 08:49:30.385657 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:49:30 crc kubenswrapper[5118]: I0223 08:49:30.386595 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 08:49:30 crc kubenswrapper[5118]: I0223 08:49:30.940526 5118 generic.go:334] "Generic (PLEG): container finished" podID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" exitCode=0 Feb 23 08:49:30 crc kubenswrapper[5118]: I0223 08:49:30.940645 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9","Type":"ContainerDied","Data":"dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3"} Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.206057 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.387850 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data\") pod \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.388002 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle\") pod \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.388151 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh24l\" (UniqueName: \"kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l\") pod \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\" (UID: \"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9\") " Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.411655 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l" (OuterVolumeSpecName: "kube-api-access-kh24l") pod "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" (UID: "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9"). InnerVolumeSpecName "kube-api-access-kh24l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.430697 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data" (OuterVolumeSpecName: "config-data") pod "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" (UID: "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.437282 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" (UID: "cfcd21b6-4c5f-4cd8-8854-3a979d36fee9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.490955 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.491002 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.491019 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh24l\" (UniqueName: \"kubernetes.io/projected/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9-kube-api-access-kh24l\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.954644 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"cfcd21b6-4c5f-4cd8-8854-3a979d36fee9","Type":"ContainerDied","Data":"23f03c27125a7cd1b91c7212d636c147d110df390e014a3f09f479cd8652e798"} Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.954749 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:49:31 crc kubenswrapper[5118]: I0223 08:49:31.955289 5118 scope.go:117] "RemoveContainer" containerID="dcbe1d1ce0bf8f78861a281ee403927d237a6263703d26749dae7f88a3b4a3c3" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.020880 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.035242 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.054289 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:49:32 crc kubenswrapper[5118]: E0223 08:49:32.055118 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.055149 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.055459 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" containerName="nova-scheduler-scheduler" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.056584 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.060621 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.067903 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.215024 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.215168 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfvk\" (UniqueName: \"kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.215290 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.316924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.317113 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfvk\" (UniqueName: \"kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.317298 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.325201 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.338501 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.338905 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfvk\" (UniqueName: \"kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk\") pod \"nova-scheduler-0\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.396069 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.879862 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.967573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e","Type":"ContainerStarted","Data":"00a165fd24be78aae22c99fffcd59ae365a879b1fa85030f431941a23e5d2145"} Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.975019 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.975136 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:49:32 crc kubenswrapper[5118]: I0223 08:49:32.988189 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:33 crc kubenswrapper[5118]: I0223 08:49:33.008886 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:33 crc kubenswrapper[5118]: I0223 08:49:33.709583 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcd21b6-4c5f-4cd8-8854-3a979d36fee9" path="/var/lib/kubelet/pods/cfcd21b6-4c5f-4cd8-8854-3a979d36fee9/volumes" Feb 23 08:49:33 crc kubenswrapper[5118]: I0223 08:49:33.994232 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e","Type":"ContainerStarted","Data":"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9"} Feb 23 08:49:34 crc kubenswrapper[5118]: I0223 08:49:34.012158 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 23 08:49:34 crc kubenswrapper[5118]: I0223 08:49:34.024867 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.024828021 podStartE2EDuration="3.024828021s" podCreationTimestamp="2026-02-23 08:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:34.01438736 +0000 UTC m=+7437.018172033" watchObservedRunningTime="2026-02-23 08:49:34.024828021 +0000 UTC m=+7437.028612604" Feb 23 08:49:35 crc kubenswrapper[5118]: I0223 08:49:35.385818 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:49:35 crc kubenswrapper[5118]: I0223 08:49:35.386179 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 08:49:35 crc kubenswrapper[5118]: I0223 08:49:35.473004 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:49:35 crc kubenswrapper[5118]: I0223 08:49:35.473090 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 08:49:36 crc kubenswrapper[5118]: I0223 08:49:36.469324 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.96:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:49:36 crc 
kubenswrapper[5118]: I0223 08:49:36.469381 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.96:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:49:36 crc kubenswrapper[5118]: I0223 08:49:36.481463 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 23 08:49:36 crc kubenswrapper[5118]: I0223 08:49:36.495520 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 23 08:49:36 crc kubenswrapper[5118]: I0223 08:49:36.556284 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:49:36 crc kubenswrapper[5118]: I0223 08:49:36.556335 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.97:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 08:49:37 crc kubenswrapper[5118]: I0223 08:49:37.397547 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 08:49:42 crc kubenswrapper[5118]: I0223 08:49:42.397672 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 08:49:42 crc kubenswrapper[5118]: I0223 08:49:42.454814 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 08:49:43 crc kubenswrapper[5118]: I0223 08:49:43.125469 
5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.388494 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.391074 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.394784 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.479254 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.479860 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.482181 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.484900 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.754141 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.757324 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.764938 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.774639 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.807474 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.807524 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.807543 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.807715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtzsm\" (UniqueName: \"kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 
08:49:45.807822 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.807964 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909599 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909692 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909727 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909754 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909791 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtzsm\" (UniqueName: \"kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909841 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.909891 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.915687 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.915745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " 
pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.915745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.928283 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:45 crc kubenswrapper[5118]: I0223 08:49:45.932713 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtzsm\" (UniqueName: \"kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm\") pod \"cinder-scheduler-0\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " pod="openstack/cinder-scheduler-0" Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.085846 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.126678 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.130245 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.139573 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.638287 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:49:46 crc kubenswrapper[5118]: W0223 08:49:46.640947 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7526776_e596_4aa9_a924_b1a61392e9fd.slice/crio-ce99fab498e5188ee4a82e3c3ea30f690b681dd202d674531c97f4b8e14ab138 WatchSource:0}: Error finding container ce99fab498e5188ee4a82e3c3ea30f690b681dd202d674531c97f4b8e14ab138: Status 404 returned error can't find the container with id ce99fab498e5188ee4a82e3c3ea30f690b681dd202d674531c97f4b8e14ab138 Feb 23 08:49:46 crc kubenswrapper[5118]: I0223 08:49:46.644509 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 08:49:47 crc kubenswrapper[5118]: I0223 08:49:47.144853 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerStarted","Data":"ce99fab498e5188ee4a82e3c3ea30f690b681dd202d674531c97f4b8e14ab138"} Feb 23 08:49:47 crc kubenswrapper[5118]: I0223 08:49:47.818081 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:47 crc kubenswrapper[5118]: I0223 08:49:47.818717 5118 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cinder-api-0" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api-log" containerID="cri-o://fe042ef0d555dd5feaf2513d36e29a1302dc0d12f7548bc911336fb4754bbff8" gracePeriod=30 Feb 23 08:49:47 crc kubenswrapper[5118]: I0223 08:49:47.819312 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api" containerID="cri-o://f2494bf1123c09af1557b911f3583242dfd171e5e0c831ff907da92cae520a68" gracePeriod=30 Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.156192 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerStarted","Data":"f185ee9e9cf94ff45dafa3f416d9e2906e35bd7df49193bc842a3eba3ac38d87"} Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.161874 5118 generic.go:334] "Generic (PLEG): container finished" podID="05f06da1-9540-4385-9815-17ba1edd64d2" containerID="fe042ef0d555dd5feaf2513d36e29a1302dc0d12f7548bc911336fb4754bbff8" exitCode=143 Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.162832 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerDied","Data":"fe042ef0d555dd5feaf2513d36e29a1302dc0d12f7548bc911336fb4754bbff8"} Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.424699 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.427125 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.431710 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.436013 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.479777 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.479849 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.479904 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.479942 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: 
I0223 08:49:48.479962 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480021 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-run\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480088 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480137 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480177 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480291 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5t5\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-kube-api-access-xd5t5\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480328 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480426 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480457 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480484 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.480527 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582355 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582374 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-run\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582481 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582498 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582516 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582540 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5t5\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-kube-api-access-xd5t5\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582556 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: 
\"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582579 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582625 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582669 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582688 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582720 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc 
kubenswrapper[5118]: I0223 08:49:48.582738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.582761 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583091 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-sys\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583094 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-run\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583230 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583378 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583473 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-dev\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583513 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583559 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583557 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583541 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " 
pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.583649 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54fb818d-8422-4517-84b1-b9d71a9f213a-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.589834 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.590453 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.590544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.591804 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.600924 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54fb818d-8422-4517-84b1-b9d71a9f213a-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.601030 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5t5\" (UniqueName: \"kubernetes.io/projected/54fb818d-8422-4517-84b1-b9d71a9f213a-kube-api-access-xd5t5\") pod \"cinder-volume-volume1-0\" (UID: \"54fb818d-8422-4517-84b1-b9d71a9f213a\") " pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:48 crc kubenswrapper[5118]: I0223 08:49:48.800383 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.186476 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerStarted","Data":"aa7256261df254014bf1dcec4f784b58c78aeb61dd52f8c5463dcf09336ae0b3"} Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.216387 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.218957 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.225541 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.266483 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.277643 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.037264201 podStartE2EDuration="4.2776143s" podCreationTimestamp="2026-02-23 08:49:45 +0000 UTC" firstStartedPulling="2026-02-23 08:49:46.644192887 +0000 UTC m=+7449.647977470" lastFinishedPulling="2026-02-23 08:49:46.884543006 +0000 UTC m=+7449.888327569" observedRunningTime="2026-02-23 08:49:49.214311636 +0000 UTC m=+7452.218096209" watchObservedRunningTime="2026-02-23 08:49:49.2776143 +0000 UTC m=+7452.281398873" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.295910 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.295961 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296001 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296021 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296054 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296072 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296130 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-run\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296156 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85qf\" (UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-kube-api-access-g85qf\") pod \"cinder-backup-0\" (UID: 
\"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296244 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-scripts\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296270 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-dev\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296298 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296318 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-lib-modules\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296346 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-sys\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296370 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.296404 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-ceph\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398395 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398452 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398512 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398621 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398711 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399049 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.398538 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-nvme\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399776 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 
08:49:49.399802 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-run\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399882 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85qf\" (UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-kube-api-access-g85qf\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399932 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-scripts\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399954 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399961 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-dev\") pod \"cinder-backup-0\" (UID: 
\"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400007 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-run\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400057 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400145 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-lib-modules\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400213 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-sys\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400265 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400320 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-ceph\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400353 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.400692 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.399988 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-dev\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.402166 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-lib-modules\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.402231 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 
08:49:49.402795 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b3cd17b-fa93-44fa-9571-cebff83d65ad-sys\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.406995 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-scripts\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.409544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data-custom\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.410689 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-config-data\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.414669 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3cd17b-fa93-44fa-9571-cebff83d65ad-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.418751 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-ceph\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " 
pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.432078 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85qf\" (UniqueName: \"kubernetes.io/projected/6b3cd17b-fa93-44fa-9571-cebff83d65ad-kube-api-access-g85qf\") pod \"cinder-backup-0\" (UID: \"6b3cd17b-fa93-44fa-9571-cebff83d65ad\") " pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.570738 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 23 08:49:49 crc kubenswrapper[5118]: I0223 08:49:49.576881 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 23 08:49:50 crc kubenswrapper[5118]: I0223 08:49:50.197312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"54fb818d-8422-4517-84b1-b9d71a9f213a","Type":"ContainerStarted","Data":"092d35dd50bcda9c3238cc61076af21f51af6bcca5e8386807ff067100931679"} Feb 23 08:49:50 crc kubenswrapper[5118]: I0223 08:49:50.220044 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 23 08:49:50 crc kubenswrapper[5118]: W0223 08:49:50.224259 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3cd17b_fa93_44fa_9571_cebff83d65ad.slice/crio-dffacafe8ccc23f858c7aa16c732854c7c32db78649f98e9d0e1a56cbf66e2e8 WatchSource:0}: Error finding container dffacafe8ccc23f858c7aa16c732854c7c32db78649f98e9d0e1a56cbf66e2e8: Status 404 returned error can't find the container with id dffacafe8ccc23f858c7aa16c732854c7c32db78649f98e9d0e1a56cbf66e2e8 Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.017082 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api" probeResult="failure" output="Get 
\"http://10.217.1.94:8776/healthcheck\": read tcp 10.217.0.2:54560->10.217.1.94:8776: read: connection reset by peer" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.086985 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.224185 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6b3cd17b-fa93-44fa-9571-cebff83d65ad","Type":"ContainerStarted","Data":"1c42f55d230dcc3171c833fddc48ac2e15ce5416c9cdbc5b343632b3f80c719f"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.224233 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6b3cd17b-fa93-44fa-9571-cebff83d65ad","Type":"ContainerStarted","Data":"6881a67160010059398317ace9154f13a06dbd58244c977e52407e940a0704a1"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.224244 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6b3cd17b-fa93-44fa-9571-cebff83d65ad","Type":"ContainerStarted","Data":"dffacafe8ccc23f858c7aa16c732854c7c32db78649f98e9d0e1a56cbf66e2e8"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.236307 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"54fb818d-8422-4517-84b1-b9d71a9f213a","Type":"ContainerStarted","Data":"29c576cb624906abada36125da77012a51e50df4bf15f9b0a2a1c00c28983cdb"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.236358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"54fb818d-8422-4517-84b1-b9d71a9f213a","Type":"ContainerStarted","Data":"4c79f177ad515968885c3d8d11e9a38761b4cba8b2a220929d8acd2bcfa1ff16"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.252611 5118 generic.go:334] "Generic (PLEG): container finished" podID="05f06da1-9540-4385-9815-17ba1edd64d2" 
containerID="f2494bf1123c09af1557b911f3583242dfd171e5e0c831ff907da92cae520a68" exitCode=0 Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.253171 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerDied","Data":"f2494bf1123c09af1557b911f3583242dfd171e5e0c831ff907da92cae520a68"} Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.253589 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=1.8638391520000002 podStartE2EDuration="2.253573209s" podCreationTimestamp="2026-02-23 08:49:49 +0000 UTC" firstStartedPulling="2026-02-23 08:49:50.22886694 +0000 UTC m=+7453.232651513" lastFinishedPulling="2026-02-23 08:49:50.618600997 +0000 UTC m=+7453.622385570" observedRunningTime="2026-02-23 08:49:51.250410983 +0000 UTC m=+7454.254195556" watchObservedRunningTime="2026-02-23 08:49:51.253573209 +0000 UTC m=+7454.257357782" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.301036 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.885006793 podStartE2EDuration="3.301013162s" podCreationTimestamp="2026-02-23 08:49:48 +0000 UTC" firstStartedPulling="2026-02-23 08:49:49.624901184 +0000 UTC m=+7452.628685757" lastFinishedPulling="2026-02-23 08:49:50.040907543 +0000 UTC m=+7453.044692126" observedRunningTime="2026-02-23 08:49:51.28514342 +0000 UTC m=+7454.288928003" watchObservedRunningTime="2026-02-23 08:49:51.301013162 +0000 UTC m=+7454.304797735" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.505751 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.694826 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.694919 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.695196 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.695248 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.695325 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxgzm\" (UniqueName: \"kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.695389 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.695592 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id\") pod \"05f06da1-9540-4385-9815-17ba1edd64d2\" (UID: \"05f06da1-9540-4385-9815-17ba1edd64d2\") " Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.696498 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.699356 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs" (OuterVolumeSpecName: "logs") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.708073 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts" (OuterVolumeSpecName: "scripts") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.708207 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.717179 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm" (OuterVolumeSpecName: "kube-api-access-vxgzm") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "kube-api-access-vxgzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.730677 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.754825 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data" (OuterVolumeSpecName: "config-data") pod "05f06da1-9540-4385-9815-17ba1edd64d2" (UID: "05f06da1-9540-4385-9815-17ba1edd64d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798656 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798691 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798704 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxgzm\" (UniqueName: \"kubernetes.io/projected/05f06da1-9540-4385-9815-17ba1edd64d2-kube-api-access-vxgzm\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798713 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798724 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/05f06da1-9540-4385-9815-17ba1edd64d2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798733 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05f06da1-9540-4385-9815-17ba1edd64d2-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:51 crc kubenswrapper[5118]: I0223 08:49:51.798741 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05f06da1-9540-4385-9815-17ba1edd64d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.266741 5118 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.266919 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"05f06da1-9540-4385-9815-17ba1edd64d2","Type":"ContainerDied","Data":"f6446efc2778963504bc60721f289032571fadfeb86f57c9048ba551c2a99876"} Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.267682 5118 scope.go:117] "RemoveContainer" containerID="f2494bf1123c09af1557b911f3583242dfd171e5e0c831ff907da92cae520a68" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.313831 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.317718 5118 scope.go:117] "RemoveContainer" containerID="fe042ef0d555dd5feaf2513d36e29a1302dc0d12f7548bc911336fb4754bbff8" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.326734 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.348957 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:52 crc kubenswrapper[5118]: E0223 08:49:52.349744 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api-log" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.349771 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api-log" Feb 23 08:49:52 crc kubenswrapper[5118]: E0223 08:49:52.349779 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.349785 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api" Feb 23 08:49:52 crc 
kubenswrapper[5118]: I0223 08:49:52.356373 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.356446 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" containerName="cinder-api-log" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.357844 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.360611 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.366062 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.520761 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cce4e91f-1362-4a4c-820f-d4a6ba874e33-logs\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.520931 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.520997 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce4e91f-1362-4a4c-820f-d4a6ba874e33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc 
kubenswrapper[5118]: I0223 08:49:52.521063 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.521135 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9s5\" (UniqueName: \"kubernetes.io/projected/cce4e91f-1362-4a4c-820f-d4a6ba874e33-kube-api-access-lq9s5\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.521238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-scripts\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.521284 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data-custom\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623160 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce4e91f-1362-4a4c-820f-d4a6ba874e33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623244 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623269 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9s5\" (UniqueName: \"kubernetes.io/projected/cce4e91f-1362-4a4c-820f-d4a6ba874e33-kube-api-access-lq9s5\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623341 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-scripts\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623377 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data-custom\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623510 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cce4e91f-1362-4a4c-820f-d4a6ba874e33-logs\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.623570 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc 
kubenswrapper[5118]: I0223 08:49:52.624524 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cce4e91f-1362-4a4c-820f-d4a6ba874e33-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.628943 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cce4e91f-1362-4a4c-820f-d4a6ba874e33-logs\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.634032 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.644710 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-scripts\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.645326 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.645490 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cce4e91f-1362-4a4c-820f-d4a6ba874e33-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.662193 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9s5\" (UniqueName: \"kubernetes.io/projected/cce4e91f-1362-4a4c-820f-d4a6ba874e33-kube-api-access-lq9s5\") pod \"cinder-api-0\" (UID: \"cce4e91f-1362-4a4c-820f-d4a6ba874e33\") " pod="openstack/cinder-api-0" Feb 23 08:49:52 crc kubenswrapper[5118]: I0223 08:49:52.685547 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 23 08:49:53 crc kubenswrapper[5118]: I0223 08:49:53.140935 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 23 08:49:53 crc kubenswrapper[5118]: I0223 08:49:53.286424 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cce4e91f-1362-4a4c-820f-d4a6ba874e33","Type":"ContainerStarted","Data":"e1bfcd8132eb7060cf41e0340dc5a38cf055cbd69e4c167b09de2398c87821f5"} Feb 23 08:49:53 crc kubenswrapper[5118]: I0223 08:49:53.713681 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05f06da1-9540-4385-9815-17ba1edd64d2" path="/var/lib/kubelet/pods/05f06da1-9540-4385-9815-17ba1edd64d2/volumes" Feb 23 08:49:53 crc kubenswrapper[5118]: I0223 08:49:53.800847 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:54 crc kubenswrapper[5118]: I0223 08:49:54.300240 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cce4e91f-1362-4a4c-820f-d4a6ba874e33","Type":"ContainerStarted","Data":"67dfa45593441fad300a5e2a91f2c94a49ce0a79cfbacc241ba706bceec6431e"} Feb 23 08:49:54 crc kubenswrapper[5118]: I0223 08:49:54.571669 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 23 08:49:55 crc kubenswrapper[5118]: I0223 
08:49:55.320259 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cce4e91f-1362-4a4c-820f-d4a6ba874e33","Type":"ContainerStarted","Data":"787980dade267fa82f2bdc912cf81aa76a9092aacb6e943346cf458873edc141"} Feb 23 08:49:55 crc kubenswrapper[5118]: I0223 08:49:55.320648 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 23 08:49:55 crc kubenswrapper[5118]: I0223 08:49:55.358549 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.358511333 podStartE2EDuration="3.358511333s" podCreationTimestamp="2026-02-23 08:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:49:55.349513727 +0000 UTC m=+7458.353298360" watchObservedRunningTime="2026-02-23 08:49:55.358511333 +0000 UTC m=+7458.362295936" Feb 23 08:49:56 crc kubenswrapper[5118]: I0223 08:49:56.353184 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 08:49:56 crc kubenswrapper[5118]: I0223 08:49:56.451483 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:49:57 crc kubenswrapper[5118]: I0223 08:49:57.342575 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="cinder-scheduler" containerID="cri-o://f185ee9e9cf94ff45dafa3f416d9e2906e35bd7df49193bc842a3eba3ac38d87" gracePeriod=30 Feb 23 08:49:57 crc kubenswrapper[5118]: I0223 08:49:57.342664 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="probe" containerID="cri-o://aa7256261df254014bf1dcec4f784b58c78aeb61dd52f8c5463dcf09336ae0b3" gracePeriod=30 Feb 23 
08:49:58 crc kubenswrapper[5118]: I0223 08:49:58.356318 5118 generic.go:334] "Generic (PLEG): container finished" podID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerID="aa7256261df254014bf1dcec4f784b58c78aeb61dd52f8c5463dcf09336ae0b3" exitCode=0 Feb 23 08:49:58 crc kubenswrapper[5118]: I0223 08:49:58.356423 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerDied","Data":"aa7256261df254014bf1dcec4f784b58c78aeb61dd52f8c5463dcf09336ae0b3"} Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.028644 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.370429 5118 generic.go:334] "Generic (PLEG): container finished" podID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerID="f185ee9e9cf94ff45dafa3f416d9e2906e35bd7df49193bc842a3eba3ac38d87" exitCode=0 Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.370710 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerDied","Data":"f185ee9e9cf94ff45dafa3f416d9e2906e35bd7df49193bc842a3eba3ac38d87"} Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.568162 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.696960 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.697230 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.697303 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtzsm\" (UniqueName: \"kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.697477 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.697533 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.697643 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data\") pod \"d7526776-e596-4aa9-a924-b1a61392e9fd\" (UID: \"d7526776-e596-4aa9-a924-b1a61392e9fd\") " Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.699157 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.708324 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.708380 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts" (OuterVolumeSpecName: "scripts") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.715641 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm" (OuterVolumeSpecName: "kube-api-access-xtzsm") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "kube-api-access-xtzsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.774500 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.800503 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtzsm\" (UniqueName: \"kubernetes.io/projected/d7526776-e596-4aa9-a924-b1a61392e9fd-kube-api-access-xtzsm\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.800552 5118 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.800563 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.800572 5118 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7526776-e596-4aa9-a924-b1a61392e9fd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.800580 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.827209 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data" (OuterVolumeSpecName: "config-data") pod "d7526776-e596-4aa9-a924-b1a61392e9fd" (UID: "d7526776-e596-4aa9-a924-b1a61392e9fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.860774 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 23 08:49:59 crc kubenswrapper[5118]: I0223 08:49:59.902527 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7526776-e596-4aa9-a924-b1a61392e9fd-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.385422 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d7526776-e596-4aa9-a924-b1a61392e9fd","Type":"ContainerDied","Data":"ce99fab498e5188ee4a82e3c3ea30f690b681dd202d674531c97f4b8e14ab138"} Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.385506 5118 scope.go:117] "RemoveContainer" containerID="aa7256261df254014bf1dcec4f784b58c78aeb61dd52f8c5463dcf09336ae0b3" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.385590 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.411599 5118 scope.go:117] "RemoveContainer" containerID="f185ee9e9cf94ff45dafa3f416d9e2906e35bd7df49193bc842a3eba3ac38d87" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.441021 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.473057 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.485751 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:50:00 crc kubenswrapper[5118]: E0223 08:50:00.486474 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="probe" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.486505 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="probe" Feb 23 08:50:00 crc kubenswrapper[5118]: E0223 08:50:00.486538 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="cinder-scheduler" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.486552 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="cinder-scheduler" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.486863 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="probe" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.486941 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" containerName="cinder-scheduler" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.489026 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.494438 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.495893 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.622238 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.622875 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dz26\" (UniqueName: \"kubernetes.io/projected/1d980983-651d-4df6-a7ca-18d1191300cf-kube-api-access-5dz26\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.623067 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d980983-651d-4df6-a7ca-18d1191300cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.623128 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.623256 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.623612 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725293 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dz26\" (UniqueName: \"kubernetes.io/projected/1d980983-651d-4df6-a7ca-18d1191300cf-kube-api-access-5dz26\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725358 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d980983-651d-4df6-a7ca-18d1191300cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725375 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725404 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725447 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725493 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.725590 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d980983-651d-4df6-a7ca-18d1191300cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.733149 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.733826 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" 
Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.736437 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.738571 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d980983-651d-4df6-a7ca-18d1191300cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.757270 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dz26\" (UniqueName: \"kubernetes.io/projected/1d980983-651d-4df6-a7ca-18d1191300cf-kube-api-access-5dz26\") pod \"cinder-scheduler-0\" (UID: \"1d980983-651d-4df6-a7ca-18d1191300cf\") " pod="openstack/cinder-scheduler-0" Feb 23 08:50:00 crc kubenswrapper[5118]: I0223 08:50:00.817974 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 23 08:50:01 crc kubenswrapper[5118]: I0223 08:50:01.386255 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 23 08:50:01 crc kubenswrapper[5118]: I0223 08:50:01.423225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d980983-651d-4df6-a7ca-18d1191300cf","Type":"ContainerStarted","Data":"6b65619d1a5f08edb20306ac1c94fbc5c84f8a22e5ccf3d09602c1e2727d6802"} Feb 23 08:50:01 crc kubenswrapper[5118]: I0223 08:50:01.710502 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7526776-e596-4aa9-a924-b1a61392e9fd" path="/var/lib/kubelet/pods/d7526776-e596-4aa9-a924-b1a61392e9fd/volumes" Feb 23 08:50:02 crc kubenswrapper[5118]: I0223 08:50:02.441015 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d980983-651d-4df6-a7ca-18d1191300cf","Type":"ContainerStarted","Data":"8ef69207cf9008b4276250fa2cb094863b20603f2bc298365835ad2fdba89005"} Feb 23 08:50:02 crc kubenswrapper[5118]: I0223 08:50:02.975558 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:50:02 crc kubenswrapper[5118]: I0223 08:50:02.976137 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:50:03 crc kubenswrapper[5118]: I0223 08:50:03.458164 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1d980983-651d-4df6-a7ca-18d1191300cf","Type":"ContainerStarted","Data":"1cde80c1747b52aabb22a2fb309a046ea24bae38446757312b51ae7788bc5c78"}
Feb 23 08:50:03 crc kubenswrapper[5118]: I0223 08:50:03.498522 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.498490037 podStartE2EDuration="3.498490037s" podCreationTimestamp="2026-02-23 08:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:50:03.482973053 +0000 UTC m=+7466.486757626" watchObservedRunningTime="2026-02-23 08:50:03.498490037 +0000 UTC m=+7466.502274620"
Feb 23 08:50:04 crc kubenswrapper[5118]: I0223 08:50:04.416549 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 23 08:50:05 crc kubenswrapper[5118]: I0223 08:50:05.818698 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.074639 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tgzf2"]
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.087597 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9e19-account-create-update-pmfcs"]
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.101423 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tgzf2"]
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.112845 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9e19-account-create-update-pmfcs"]
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.711252 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c7a2f84-6788-4fae-a465-b0cf157c16e5" path="/var/lib/kubelet/pods/9c7a2f84-6788-4fae-a465-b0cf157c16e5/volumes"
Feb 23 08:50:09 crc kubenswrapper[5118]: I0223 08:50:09.712506 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76" path="/var/lib/kubelet/pods/eaf3c309-3ffb-4ea8-aa56-0e9dfd8cba76/volumes"
Feb 23 08:50:11 crc kubenswrapper[5118]: I0223 08:50:11.055921 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 23 08:50:20 crc kubenswrapper[5118]: I0223 08:50:20.050517 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-t4jr4"]
Feb 23 08:50:20 crc kubenswrapper[5118]: I0223 08:50:20.066623 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-t4jr4"]
Feb 23 08:50:21 crc kubenswrapper[5118]: I0223 08:50:21.713436 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d843d0a0-c5d1-47e0-8952-5504b7d79b88" path="/var/lib/kubelet/pods/d843d0a0-c5d1-47e0-8952-5504b7d79b88/volumes"
Feb 23 08:50:32 crc kubenswrapper[5118]: I0223 08:50:32.975241 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 08:50:32 crc kubenswrapper[5118]: I0223 08:50:32.975953 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 08:50:32 crc kubenswrapper[5118]: I0223 08:50:32.976029 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 08:50:32 crc kubenswrapper[5118]: I0223 08:50:32.977194 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 08:50:32 crc kubenswrapper[5118]: I0223 08:50:32.977291 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" gracePeriod=600
Feb 23 08:50:33 crc kubenswrapper[5118]: E0223 08:50:33.107144 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:50:33 crc kubenswrapper[5118]: I0223 08:50:33.810189 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" exitCode=0
Feb 23 08:50:33 crc kubenswrapper[5118]: I0223 08:50:33.810239 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"}
Feb 23 08:50:33 crc kubenswrapper[5118]: I0223 08:50:33.810281 5118 scope.go:117] "RemoveContainer" containerID="736e449ee6985bd3eea8148239e5d9c3b28aa25fd251975364e0cd9fe953dfbb"
Feb 23 08:50:33 crc kubenswrapper[5118]: I0223 08:50:33.811408 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:50:33 crc kubenswrapper[5118]: E0223 08:50:33.812041 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.060508 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rhx92"]
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.069490 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rhx92"]
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.398956 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.402676 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.418787 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.577529 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.578427 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jxq\" (UniqueName: \"kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.578534 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.681223 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jxq\" (UniqueName: \"kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.681349 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.681416 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.681846 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.681957 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.704880 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jxq\" (UniqueName: \"kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq\") pod \"redhat-marketplace-wmv7q\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") " pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:34 crc kubenswrapper[5118]: I0223 08:50:34.729986 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:35 crc kubenswrapper[5118]: I0223 08:50:35.239714 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:35 crc kubenswrapper[5118]: I0223 08:50:35.709276 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062a4716-72d8-4714-bee7-679602f5df50" path="/var/lib/kubelet/pods/062a4716-72d8-4714-bee7-679602f5df50/volumes"
Feb 23 08:50:35 crc kubenswrapper[5118]: I0223 08:50:35.894504 5118 generic.go:334] "Generic (PLEG): container finished" podID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerID="3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd" exitCode=0
Feb 23 08:50:35 crc kubenswrapper[5118]: I0223 08:50:35.894876 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerDied","Data":"3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd"}
Feb 23 08:50:35 crc kubenswrapper[5118]: I0223 08:50:35.895017 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerStarted","Data":"42bb20f8ab53fdde7e69f9fca9385ea353dd3d2e22a55b768d3f7d0c18b88641"}
Feb 23 08:50:36 crc kubenswrapper[5118]: I0223 08:50:36.920996 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerStarted","Data":"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"}
Feb 23 08:50:37 crc kubenswrapper[5118]: I0223 08:50:37.936085 5118 generic.go:334] "Generic (PLEG): container finished" podID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerID="e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646" exitCode=0
Feb 23 08:50:37 crc kubenswrapper[5118]: I0223 08:50:37.936190 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerDied","Data":"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"}
Feb 23 08:50:38 crc kubenswrapper[5118]: I0223 08:50:38.962503 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerStarted","Data":"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"}
Feb 23 08:50:38 crc kubenswrapper[5118]: I0223 08:50:38.982988 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wmv7q" podStartSLOduration=2.536167707 podStartE2EDuration="4.982965476s" podCreationTimestamp="2026-02-23 08:50:34 +0000 UTC" firstStartedPulling="2026-02-23 08:50:35.896750188 +0000 UTC m=+7498.900534761" lastFinishedPulling="2026-02-23 08:50:38.343547927 +0000 UTC m=+7501.347332530" observedRunningTime="2026-02-23 08:50:38.980887476 +0000 UTC m=+7501.984672069" watchObservedRunningTime="2026-02-23 08:50:38.982965476 +0000 UTC m=+7501.986750049"
Feb 23 08:50:44 crc kubenswrapper[5118]: I0223 08:50:44.730251 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:44 crc kubenswrapper[5118]: I0223 08:50:44.730805 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:44 crc kubenswrapper[5118]: I0223 08:50:44.804200 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:45 crc kubenswrapper[5118]: I0223 08:50:45.120521 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:45 crc kubenswrapper[5118]: I0223 08:50:45.217593 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.062880 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wmv7q" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="registry-server" containerID="cri-o://6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6" gracePeriod=2
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.688652 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.802555 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities\") pod \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") "
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.802639 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content\") pod \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") "
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.802677 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jxq\" (UniqueName: \"kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq\") pod \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\" (UID: \"988f748b-e8b8-4eb8-852c-dc6597e6d6fc\") "
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.804255 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities" (OuterVolumeSpecName: "utilities") pod "988f748b-e8b8-4eb8-852c-dc6597e6d6fc" (UID: "988f748b-e8b8-4eb8-852c-dc6597e6d6fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.812656 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq" (OuterVolumeSpecName: "kube-api-access-x9jxq") pod "988f748b-e8b8-4eb8-852c-dc6597e6d6fc" (UID: "988f748b-e8b8-4eb8-852c-dc6597e6d6fc"). InnerVolumeSpecName "kube-api-access-x9jxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.836354 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "988f748b-e8b8-4eb8-852c-dc6597e6d6fc" (UID: "988f748b-e8b8-4eb8-852c-dc6597e6d6fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.907975 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.908056 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:47 crc kubenswrapper[5118]: I0223 08:50:47.908078 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jxq\" (UniqueName: \"kubernetes.io/projected/988f748b-e8b8-4eb8-852c-dc6597e6d6fc-kube-api-access-x9jxq\") on node \"crc\" DevicePath \"\""
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.073735 5118 generic.go:334] "Generic (PLEG): container finished" podID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerID="6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6" exitCode=0
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.073855 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerDied","Data":"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"}
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.074012 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmv7q" event={"ID":"988f748b-e8b8-4eb8-852c-dc6597e6d6fc","Type":"ContainerDied","Data":"42bb20f8ab53fdde7e69f9fca9385ea353dd3d2e22a55b768d3f7d0c18b88641"}
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.074047 5118 scope.go:117] "RemoveContainer" containerID="6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.073880 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmv7q"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.113037 5118 scope.go:117] "RemoveContainer" containerID="e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.123416 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.134813 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmv7q"]
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.144920 5118 scope.go:117] "RemoveContainer" containerID="3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.194515 5118 scope.go:117] "RemoveContainer" containerID="6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"
Feb 23 08:50:48 crc kubenswrapper[5118]: E0223 08:50:48.195186 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6\": container with ID starting with 6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6 not found: ID does not exist" containerID="6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.195250 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6"} err="failed to get container status \"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6\": rpc error: code = NotFound desc = could not find container \"6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6\": container with ID starting with 6b339089397fcb1ff7c12ef8e7ca57299e20c57aea4a201561fa33f96a21a9d6 not found: ID does not exist"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.195310 5118 scope.go:117] "RemoveContainer" containerID="e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"
Feb 23 08:50:48 crc kubenswrapper[5118]: E0223 08:50:48.196332 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646\": container with ID starting with e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646 not found: ID does not exist" containerID="e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.196399 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646"} err="failed to get container status \"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646\": rpc error: code = NotFound desc = could not find container \"e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646\": container with ID starting with e6a1442675d4180b8ab9675e408a56eea2b0e20f8cb1e4891b690f5a06c5c646 not found: ID does not exist"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.196441 5118 scope.go:117] "RemoveContainer" containerID="3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd"
Feb 23 08:50:48 crc kubenswrapper[5118]: E0223 08:50:48.196798 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd\": container with ID starting with 3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd not found: ID does not exist" containerID="3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.196846 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd"} err="failed to get container status \"3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd\": rpc error: code = NotFound desc = could not find container \"3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd\": container with ID starting with 3545e28a812ddb850bbe71f24bb51486af0ed71c95ebe78a4908bd1347c7bafd not found: ID does not exist"
Feb 23 08:50:48 crc kubenswrapper[5118]: I0223 08:50:48.698043 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:50:48 crc kubenswrapper[5118]: E0223 08:50:48.698782 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:50:49 crc kubenswrapper[5118]: I0223 08:50:49.721066 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" path="/var/lib/kubelet/pods/988f748b-e8b8-4eb8-852c-dc6597e6d6fc/volumes"
Feb 23 08:51:02 crc kubenswrapper[5118]: I0223 08:51:02.833076 5118 scope.go:117] "RemoveContainer" containerID="2e865aad1edfc20cfe99140a3916f976fa142f7f9704d7ff8a3981adfdcb865d"
Feb 23 08:51:02 crc kubenswrapper[5118]: I0223 08:51:02.894280 5118 scope.go:117] "RemoveContainer" containerID="3e1a8298b1d3fc26fe3bcf9f090bc060316781292f29b2bdd1f6aad49531de10"
Feb 23 08:51:02 crc kubenswrapper[5118]: I0223 08:51:02.947142 5118 scope.go:117] "RemoveContainer" containerID="1e986731a23223ef3011ca66015a2a6164654579cd221931de7f6423bdd94d3a"
Feb 23 08:51:03 crc kubenswrapper[5118]: I0223 08:51:03.000552 5118 scope.go:117] "RemoveContainer" containerID="203e325c2db89155b3ef8068e97b6c4fddeb91994f94fab144f7d1123b4bd143"
Feb 23 08:51:03 crc kubenswrapper[5118]: I0223 08:51:03.699879 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:51:03 crc kubenswrapper[5118]: E0223 08:51:03.701121 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:51:16 crc kubenswrapper[5118]: I0223 08:51:16.699261 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:51:16 crc kubenswrapper[5118]: E0223 08:51:16.700717 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:51:27 crc kubenswrapper[5118]: I0223 08:51:27.705591 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:51:27 crc kubenswrapper[5118]: E0223 08:51:27.706770 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:51:40 crc kubenswrapper[5118]: I0223 08:51:40.697725 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:51:40 crc kubenswrapper[5118]: E0223 08:51:40.699142 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.727641 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"]
Feb 23 08:51:45 crc kubenswrapper[5118]: E0223 08:51:45.729742 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="extract-utilities"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.729840 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="extract-utilities"
Feb 23 08:51:45 crc kubenswrapper[5118]: E0223 08:51:45.729913 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="extract-content"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.729976 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="extract-content"
Feb 23 08:51:45 crc kubenswrapper[5118]: E0223 08:51:45.730051 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="registry-server"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.730165 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="registry-server"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.730427 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="988f748b-e8b8-4eb8-852c-dc6597e6d6fc" containerName="registry-server"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.731535 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.734766 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.735054 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6spdb"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.736330 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.736348 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.748986 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"]
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.789761 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.790320 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-log" containerID="cri-o://8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25" gracePeriod=30
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.790442 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-httpd" containerID="cri-o://db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26" gracePeriod=30
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.828599 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.828661 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9x9\" (UniqueName: \"kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.828761 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.828830 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.829143 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.852249 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.852766 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-log" containerID="cri-o://fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213" gracePeriod=30
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.853318 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-httpd" containerID="cri-o://fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6" gracePeriod=30
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.885399 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"]
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.888826 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938258 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj57r\" (UniqueName: \"kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938330 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938460 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938572 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938614 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938633 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938659 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg9x9\" (UniqueName: \"kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938710 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.938801 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw"
Feb 23 08:51:45 crc kubenswrapper[5118]:
I0223 08:51:45.938969 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.940980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.941000 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.941031 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"] Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.948636 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:45 crc kubenswrapper[5118]: I0223 08:51:45.956092 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9x9\" (UniqueName: \"kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9\") pod \"horizon-6b6f876c8f-khnvw\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:46 crc 
kubenswrapper[5118]: I0223 08:51:46.040583 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.040677 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.040732 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.040800 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj57r\" (UniqueName: \"kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.040829 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.041454 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.041805 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.043037 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.043467 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.058081 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.058866 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj57r\" (UniqueName: \"kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r\") pod \"horizon-79bd4bc597-nbr9p\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.331749 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.393820 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"] Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.437476 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.439079 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.447774 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.460404 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.460447 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.460467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.460488 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.460529 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4nx\" (UniqueName: 
\"kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.529151 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"] Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.562738 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4nx\" (UniqueName: \"kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.563273 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.563304 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.563344 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.563365 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.564546 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.565596 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.569984 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.577437 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key\") pod \"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.587209 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4nx\" (UniqueName: \"kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx\") pod 
\"horizon-8695445c8f-brjrf\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.764051 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.806010 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerStarted","Data":"e3cbaf0819692ca47a05979237bd4f1de7886f6b2d67ca21375abd1890c8bbfa"} Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.808611 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerID="8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25" exitCode=143 Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.808724 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerDied","Data":"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25"} Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.810727 5118 generic.go:334] "Generic (PLEG): container finished" podID="de407020-eb83-4653-8880-2fd15bee8791" containerID="fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213" exitCode=143 Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.810762 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerDied","Data":"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213"} Feb 23 08:51:46 crc kubenswrapper[5118]: I0223 08:51:46.874740 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"] Feb 23 08:51:46 crc kubenswrapper[5118]: W0223 08:51:46.877380 5118 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeda99e5a_c7d9_45d9_ac0c_3124eaa4181d.slice/crio-38a063409887bbeb0e5ce347900afb605b490f121f960e300a2a03b7d2a628ac WatchSource:0}: Error finding container 38a063409887bbeb0e5ce347900afb605b490f121f960e300a2a03b7d2a628ac: Status 404 returned error can't find the container with id 38a063409887bbeb0e5ce347900afb605b490f121f960e300a2a03b7d2a628ac Feb 23 08:51:47 crc kubenswrapper[5118]: I0223 08:51:47.268431 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:51:47 crc kubenswrapper[5118]: I0223 08:51:47.822337 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerStarted","Data":"38a063409887bbeb0e5ce347900afb605b490f121f960e300a2a03b7d2a628ac"} Feb 23 08:51:47 crc kubenswrapper[5118]: I0223 08:51:47.824269 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerStarted","Data":"50dd73f37f1138e8963930ac40a6e54498b69835deed35141dbb1160e85cb1b8"} Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.567352 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634379 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634507 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634571 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634595 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h744\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634661 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634678 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.634864 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs\") pod \"2e2a35af-1457-4c23-b5c8-425735c0d833\" (UID: \"2e2a35af-1457-4c23-b5c8-425735c0d833\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.635517 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.635779 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs" (OuterVolumeSpecName: "logs") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.636415 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.636444 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e2a35af-1457-4c23-b5c8-425735c0d833-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.642465 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph" (OuterVolumeSpecName: "ceph") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.646060 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744" (OuterVolumeSpecName: "kube-api-access-4h744") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "kube-api-access-4h744". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.651441 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts" (OuterVolumeSpecName: "scripts") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.678346 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.696516 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.742543 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq8zr\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.765596 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.765699 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.765786 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data\") pod 
\"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.765878 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.765990 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.766051 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph\") pod \"de407020-eb83-4653-8880-2fd15bee8791\" (UID: \"de407020-eb83-4653-8880-2fd15bee8791\") " Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.747533 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr" (OuterVolumeSpecName: "kube-api-access-mq8zr") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "kube-api-access-mq8zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.768392 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.768422 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq8zr\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-kube-api-access-mq8zr\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.768503 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.768518 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h744\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-kube-api-access-4h744\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.768533 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2e2a35af-1457-4c23-b5c8-425735c0d833-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.769261 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.770486 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs" (OuterVolumeSpecName: "logs") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.787152 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts" (OuterVolumeSpecName: "scripts") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.792139 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph" (OuterVolumeSpecName: "ceph") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.792494 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data" (OuterVolumeSpecName: "config-data") pod "2e2a35af-1457-4c23-b5c8-425735c0d833" (UID: "2e2a35af-1457-4c23-b5c8-425735c0d833"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.810547 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.848631 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerID="db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26" exitCode=0 Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.848692 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerDied","Data":"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26"} Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.848723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e2a35af-1457-4c23-b5c8-425735c0d833","Type":"ContainerDied","Data":"c7efe83597a93a1ddee41fa59313a0f0ddabc9ef894abcd1abf42ee725c4001e"} Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.848739 5118 scope.go:117] "RemoveContainer" containerID="db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.848861 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.855121 5118 generic.go:334] "Generic (PLEG): container finished" podID="de407020-eb83-4653-8880-2fd15bee8791" containerID="fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6" exitCode=0 Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.855180 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerDied","Data":"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6"} Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.855213 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"de407020-eb83-4653-8880-2fd15bee8791","Type":"ContainerDied","Data":"61cf4b0db9dc81c3ee85f5820d3705495f5040ce7ee960c88c33f2e57ded0c0d"} Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.855288 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.862887 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data" (OuterVolumeSpecName: "config-data") pod "de407020-eb83-4653-8880-2fd15bee8791" (UID: "de407020-eb83-4653-8880-2fd15bee8791"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871165 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871212 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e2a35af-1457-4c23-b5c8-425735c0d833-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871222 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871235 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871244 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/de407020-eb83-4653-8880-2fd15bee8791-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871254 5118 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de407020-eb83-4653-8880-2fd15bee8791-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.871265 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de407020-eb83-4653-8880-2fd15bee8791-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.891047 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.901915 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.909004 5118 scope.go:117] "RemoveContainer" containerID="8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.910743 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.911136 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911155 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.911170 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911177 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.911189 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911196 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.911223 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911229 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911387 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911399 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-log" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911409 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.911429 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="de407020-eb83-4653-8880-2fd15bee8791" containerName="glance-httpd" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.912337 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.920554 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.927825 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.961466 5118 scope.go:117] "RemoveContainer" containerID="db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26" Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.962376 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26\": container with ID starting with db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26 not found: ID does not exist" containerID="db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.962428 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26"} err="failed to get container status \"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26\": rpc error: code = NotFound desc = could not find container \"db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26\": container with ID starting with db75e2b6be943f555c9c520b1bb3ddf430d86e5436ab2a9d1a987af9a1f33f26 not found: ID does not exist" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.962477 5118 scope.go:117] "RemoveContainer" containerID="8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25" Feb 23 08:51:49 crc kubenswrapper[5118]: E0223 08:51:49.962838 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25\": container with ID starting with 8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25 not found: ID does not exist" containerID="8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.962917 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25"} err="failed to get container status \"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25\": rpc error: code = NotFound desc = could not find container \"8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25\": container with ID starting with 8b4d9d08f82902717535f7547cfb535c3a31498e7c2e6775939bb9e6dfa19a25 not found: ID does not exist" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.962977 5118 scope.go:117] "RemoveContainer" containerID="fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.977347 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.977395 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.977596 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-ceph\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.977632 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.977953 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prddz\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-kube-api-access-prddz\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.978272 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-logs\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.978392 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:49 crc kubenswrapper[5118]: I0223 08:51:49.997732 5118 scope.go:117] 
"RemoveContainer" containerID="fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.047136 5118 scope.go:117] "RemoveContainer" containerID="fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6" Feb 23 08:51:50 crc kubenswrapper[5118]: E0223 08:51:50.047689 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6\": container with ID starting with fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6 not found: ID does not exist" containerID="fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.047768 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6"} err="failed to get container status \"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6\": rpc error: code = NotFound desc = could not find container \"fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6\": container with ID starting with fe8a03aaac456a8f88c2604b3de3397d211f98fb7a9d376eacd544bf0695bcc6 not found: ID does not exist" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.047805 5118 scope.go:117] "RemoveContainer" containerID="fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213" Feb 23 08:51:50 crc kubenswrapper[5118]: E0223 08:51:50.049569 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213\": container with ID starting with fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213 not found: ID does not exist" containerID="fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213" Feb 23 08:51:50 crc 
kubenswrapper[5118]: I0223 08:51:50.049602 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213"} err="failed to get container status \"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213\": rpc error: code = NotFound desc = could not find container \"fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213\": container with ID starting with fea25896664e7d1811e841d7611eb27ceaa1646d21c8f56cac331e04eb5d1213 not found: ID does not exist" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086564 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-ceph\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086627 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086720 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prddz\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-kube-api-access-prddz\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086838 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-logs\") pod 
\"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086885 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086916 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.086937 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.088387 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.088402 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3c8427e-8b9a-4896-bf16-d50804df2346-logs\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " 
pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.092596 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.093199 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.097450 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-ceph\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.097492 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3c8427e-8b9a-4896-bf16-d50804df2346-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.104587 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prddz\" (UniqueName: \"kubernetes.io/projected/c3c8427e-8b9a-4896-bf16-d50804df2346-kube-api-access-prddz\") pod \"glance-default-external-api-0\" (UID: \"c3c8427e-8b9a-4896-bf16-d50804df2346\") " pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.202677 
5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.220755 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.230220 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.232679 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.235020 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.237649 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.243465 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.292508 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.292576 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.292918 
5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.293026 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmzz\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-kube-api-access-xwmzz\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.293192 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.293642 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.293731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: 
I0223 08:51:50.395970 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396644 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmzz\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-kube-api-access-xwmzz\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396727 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396851 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396883 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396908 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.396949 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.397875 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-logs\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.398260 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51a8db46-f5fa-42f9-b28e-68791b50dc7a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.400314 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.401453 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.404219 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.409072 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a8db46-f5fa-42f9-b28e-68791b50dc7a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.412938 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmzz\" (UniqueName: \"kubernetes.io/projected/51a8db46-f5fa-42f9-b28e-68791b50dc7a-kube-api-access-xwmzz\") pod \"glance-default-internal-api-0\" (UID: \"51a8db46-f5fa-42f9-b28e-68791b50dc7a\") " pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.552875 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 23 08:51:50 crc kubenswrapper[5118]: I0223 08:51:50.823272 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 23 08:51:51 crc kubenswrapper[5118]: I0223 08:51:51.712982 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e2a35af-1457-4c23-b5c8-425735c0d833" path="/var/lib/kubelet/pods/2e2a35af-1457-4c23-b5c8-425735c0d833/volumes" Feb 23 08:51:51 crc kubenswrapper[5118]: I0223 08:51:51.713981 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de407020-eb83-4653-8880-2fd15bee8791" path="/var/lib/kubelet/pods/de407020-eb83-4653-8880-2fd15bee8791/volumes" Feb 23 08:51:54 crc kubenswrapper[5118]: I0223 08:51:54.697754 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:51:54 crc kubenswrapper[5118]: E0223 08:51:54.698716 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:51:55 crc kubenswrapper[5118]: I0223 08:51:55.947164 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c8427e-8b9a-4896-bf16-d50804df2346","Type":"ContainerStarted","Data":"35b2f4693b972c5dc75a1f6b45af57e348e739e3ccbab9a8b4b4d155eedcb28b"} Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.317111 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.982084 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerStarted","Data":"c46b00bd65273ab9009a77f50b95cd4265a2ac1b2384009c4f687834ac16b13a"} Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.983023 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerStarted","Data":"1b83b7b75dda9f94ae4e589088112a7497d29a197126f4f9d4b38e51d0449ff6"} Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.993954 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerStarted","Data":"00e6e1f98f77a4e9c690fa82a16f640b300f205807b666add1533b5760dd00a5"} Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.994019 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerStarted","Data":"c82a1db7dcb8c470362958480bd9c7af8a74826c7699f87017adf348392a7cc7"} Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.995201 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b6f876c8f-khnvw" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon-log" containerID="cri-o://c82a1db7dcb8c470362958480bd9c7af8a74826c7699f87017adf348392a7cc7" gracePeriod=30 Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.995521 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6b6f876c8f-khnvw" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon" containerID="cri-o://00e6e1f98f77a4e9c690fa82a16f640b300f205807b666add1533b5760dd00a5" gracePeriod=30 Feb 23 08:51:56 crc kubenswrapper[5118]: I0223 08:51:56.999802 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"c3c8427e-8b9a-4896-bf16-d50804df2346","Type":"ContainerStarted","Data":"657bb050e9e3e6537605de5d7eab7d18b72aeacb97a379146a538f36506a3206"} Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.024525 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51a8db46-f5fa-42f9-b28e-68791b50dc7a","Type":"ContainerStarted","Data":"67803461f2dc589fd719ffa2ed1909b790b286266c4778364e23736373eba785"} Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.024602 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51a8db46-f5fa-42f9-b28e-68791b50dc7a","Type":"ContainerStarted","Data":"86981e86469805a5ac333a6178983b6cc93e3df79b785b3b4bf808045447dc08"} Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.034691 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerStarted","Data":"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad"} Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.034811 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerStarted","Data":"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c"} Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.038419 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79bd4bc597-nbr9p" podStartSLOduration=3.062677136 podStartE2EDuration="12.038399348s" podCreationTimestamp="2026-02-23 08:51:45 +0000 UTC" firstStartedPulling="2026-02-23 08:51:46.882719978 +0000 UTC m=+7569.886504551" lastFinishedPulling="2026-02-23 08:51:55.85844219 +0000 UTC m=+7578.862226763" observedRunningTime="2026-02-23 08:51:57.00693145 +0000 UTC m=+7580.010716013" 
watchObservedRunningTime="2026-02-23 08:51:57.038399348 +0000 UTC m=+7580.042183921" Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.050473 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6b6f876c8f-khnvw" podStartSLOduration=2.7174944119999997 podStartE2EDuration="12.050422397s" podCreationTimestamp="2026-02-23 08:51:45 +0000 UTC" firstStartedPulling="2026-02-23 08:51:46.535395823 +0000 UTC m=+7569.539180396" lastFinishedPulling="2026-02-23 08:51:55.868323808 +0000 UTC m=+7578.872108381" observedRunningTime="2026-02-23 08:51:57.027438144 +0000 UTC m=+7580.031222727" watchObservedRunningTime="2026-02-23 08:51:57.050422397 +0000 UTC m=+7580.054206970" Feb 23 08:51:57 crc kubenswrapper[5118]: I0223 08:51:57.071218 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8695445c8f-brjrf" podStartSLOduration=2.4838854 podStartE2EDuration="11.071201958s" podCreationTimestamp="2026-02-23 08:51:46 +0000 UTC" firstStartedPulling="2026-02-23 08:51:47.281872171 +0000 UTC m=+7570.285656744" lastFinishedPulling="2026-02-23 08:51:55.869188739 +0000 UTC m=+7578.872973302" observedRunningTime="2026-02-23 08:51:57.055595332 +0000 UTC m=+7580.059379905" watchObservedRunningTime="2026-02-23 08:51:57.071201958 +0000 UTC m=+7580.074986531" Feb 23 08:51:58 crc kubenswrapper[5118]: I0223 08:51:58.049534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3c8427e-8b9a-4896-bf16-d50804df2346","Type":"ContainerStarted","Data":"55f587c103930ac3426bb9aec8ad36e975445eac81aec918264f5eb48cbbd275"} Feb 23 08:51:58 crc kubenswrapper[5118]: I0223 08:51:58.052817 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"51a8db46-f5fa-42f9-b28e-68791b50dc7a","Type":"ContainerStarted","Data":"121c9ae6ee5fa8682d3f190a72152d9ea38743546c89f4126f1ad2f133e41501"} Feb 23 08:51:58 crc 
kubenswrapper[5118]: I0223 08:51:58.080978 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.080960077 podStartE2EDuration="9.080960077s" podCreationTimestamp="2026-02-23 08:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:58.071560741 +0000 UTC m=+7581.075345324" watchObservedRunningTime="2026-02-23 08:51:58.080960077 +0000 UTC m=+7581.084744650" Feb 23 08:51:58 crc kubenswrapper[5118]: I0223 08:51:58.108827 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.108799967 podStartE2EDuration="8.108799967s" podCreationTimestamp="2026-02-23 08:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:51:58.097541116 +0000 UTC m=+7581.101325699" watchObservedRunningTime="2026-02-23 08:51:58.108799967 +0000 UTC m=+7581.112584540" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.238512 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.239172 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.275312 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.283872 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.553995 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.554059 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.598343 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:00 crc kubenswrapper[5118]: I0223 08:52:00.605020 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:01 crc kubenswrapper[5118]: I0223 08:52:01.084424 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 08:52:01 crc kubenswrapper[5118]: I0223 08:52:01.084513 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:01 crc kubenswrapper[5118]: I0223 08:52:01.084527 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:01 crc kubenswrapper[5118]: I0223 08:52:01.084536 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 23 08:52:03 crc kubenswrapper[5118]: I0223 08:52:03.807932 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 23 08:52:03 crc kubenswrapper[5118]: I0223 08:52:03.829336 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:03 crc kubenswrapper[5118]: I0223 08:52:03.834220 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 23 08:52:04 crc kubenswrapper[5118]: I0223 08:52:04.510589 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.059245 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.332632 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.333683 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.334733 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.699492 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:52:06 crc kubenswrapper[5118]: E0223 08:52:06.700532 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.764640 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.764719 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 
08:52:06 crc kubenswrapper[5118]: I0223 08:52:06.767414 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Feb 23 08:52:16 crc kubenswrapper[5118]: I0223 08:52:16.333282 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 23 08:52:16 crc kubenswrapper[5118]: I0223 08:52:16.764790 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Feb 23 08:52:21 crc kubenswrapper[5118]: I0223 08:52:21.698389 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:52:21 crc kubenswrapper[5118]: E0223 08:52:21.699798 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480091 5118 generic.go:334] "Generic (PLEG): container finished" podID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" 
containerID="00e6e1f98f77a4e9c690fa82a16f640b300f205807b666add1533b5760dd00a5" exitCode=137 Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480654 5118 generic.go:334] "Generic (PLEG): container finished" podID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerID="c82a1db7dcb8c470362958480bd9c7af8a74826c7699f87017adf348392a7cc7" exitCode=137 Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480677 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerDied","Data":"00e6e1f98f77a4e9c690fa82a16f640b300f205807b666add1533b5760dd00a5"} Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480706 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerDied","Data":"c82a1db7dcb8c470362958480bd9c7af8a74826c7699f87017adf348392a7cc7"} Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480717 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b6f876c8f-khnvw" event={"ID":"e65b9ebe-56f5-4755-95c5-8e787ff56cc7","Type":"ContainerDied","Data":"e3cbaf0819692ca47a05979237bd4f1de7886f6b2d67ca21375abd1890c8bbfa"} Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.480726 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3cbaf0819692ca47a05979237bd4f1de7886f6b2d67ca21375abd1890c8bbfa" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.530015 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.634998 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts\") pod \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.635166 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg9x9\" (UniqueName: \"kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9\") pod \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.635328 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key\") pod \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.635362 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs\") pod \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.635410 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data\") pod \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\" (UID: \"e65b9ebe-56f5-4755-95c5-8e787ff56cc7\") " Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.637268 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs" (OuterVolumeSpecName: "logs") pod "e65b9ebe-56f5-4755-95c5-8e787ff56cc7" (UID: "e65b9ebe-56f5-4755-95c5-8e787ff56cc7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.643855 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9" (OuterVolumeSpecName: "kube-api-access-mg9x9") pod "e65b9ebe-56f5-4755-95c5-8e787ff56cc7" (UID: "e65b9ebe-56f5-4755-95c5-8e787ff56cc7"). InnerVolumeSpecName "kube-api-access-mg9x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.651273 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e65b9ebe-56f5-4755-95c5-8e787ff56cc7" (UID: "e65b9ebe-56f5-4755-95c5-8e787ff56cc7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.667866 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data" (OuterVolumeSpecName: "config-data") pod "e65b9ebe-56f5-4755-95c5-8e787ff56cc7" (UID: "e65b9ebe-56f5-4755-95c5-8e787ff56cc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.679263 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts" (OuterVolumeSpecName: "scripts") pod "e65b9ebe-56f5-4755-95c5-8e787ff56cc7" (UID: "e65b9ebe-56f5-4755-95c5-8e787ff56cc7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.738768 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg9x9\" (UniqueName: \"kubernetes.io/projected/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-kube-api-access-mg9x9\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.739004 5118 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.739029 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.739140 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:27 crc kubenswrapper[5118]: I0223 08:52:27.739182 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e65b9ebe-56f5-4755-95c5-8e787ff56cc7-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:52:28 crc kubenswrapper[5118]: I0223 08:52:28.493058 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b6f876c8f-khnvw" Feb 23 08:52:28 crc kubenswrapper[5118]: I0223 08:52:28.538602 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"] Feb 23 08:52:28 crc kubenswrapper[5118]: I0223 08:52:28.552457 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b6f876c8f-khnvw"] Feb 23 08:52:28 crc kubenswrapper[5118]: I0223 08:52:28.622889 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:52:28 crc kubenswrapper[5118]: I0223 08:52:28.638845 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:52:29 crc kubenswrapper[5118]: I0223 08:52:29.719758 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" path="/var/lib/kubelet/pods/e65b9ebe-56f5-4755-95c5-8e787ff56cc7/volumes" Feb 23 08:52:30 crc kubenswrapper[5118]: I0223 08:52:30.371315 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:52:30 crc kubenswrapper[5118]: I0223 08:52:30.403942 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:52:30 crc kubenswrapper[5118]: I0223 08:52:30.499179 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"] Feb 23 08:52:30 crc kubenswrapper[5118]: I0223 08:52:30.524293 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon-log" containerID="cri-o://1b83b7b75dda9f94ae4e589088112a7497d29a197126f4f9d4b38e51d0449ff6" gracePeriod=30 Feb 23 08:52:30 crc kubenswrapper[5118]: I0223 08:52:30.524349 5118 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" containerID="cri-o://c46b00bd65273ab9009a77f50b95cd4265a2ac1b2384009c4f687834ac16b13a" gracePeriod=30 Feb 23 08:52:34 crc kubenswrapper[5118]: I0223 08:52:34.572690 5118 generic.go:334] "Generic (PLEG): container finished" podID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerID="c46b00bd65273ab9009a77f50b95cd4265a2ac1b2384009c4f687834ac16b13a" exitCode=0 Feb 23 08:52:34 crc kubenswrapper[5118]: I0223 08:52:34.572796 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerDied","Data":"c46b00bd65273ab9009a77f50b95cd4265a2ac1b2384009c4f687834ac16b13a"} Feb 23 08:52:35 crc kubenswrapper[5118]: I0223 08:52:35.698351 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:52:35 crc kubenswrapper[5118]: E0223 08:52:35.698791 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:52:36 crc kubenswrapper[5118]: I0223 08:52:36.332939 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 23 08:52:46 crc kubenswrapper[5118]: I0223 08:52:46.332979 5118 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 23 08:52:47 crc kubenswrapper[5118]: I0223 08:52:47.708192 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:52:47 crc kubenswrapper[5118]: E0223 08:52:47.708641 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:52:56 crc kubenswrapper[5118]: I0223 08:52:56.333129 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79bd4bc597-nbr9p" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 23 08:52:56 crc kubenswrapper[5118]: I0223 08:52:56.333988 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:53:00 crc kubenswrapper[5118]: I0223 08:53:00.697236 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:53:00 crc kubenswrapper[5118]: E0223 08:53:00.698215 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:53:00 crc kubenswrapper[5118]: I0223 08:53:00.858860 5118 generic.go:334] "Generic (PLEG): container finished" podID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerID="1b83b7b75dda9f94ae4e589088112a7497d29a197126f4f9d4b38e51d0449ff6" exitCode=137 Feb 23 08:53:00 crc kubenswrapper[5118]: I0223 08:53:00.858944 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerDied","Data":"1b83b7b75dda9f94ae4e589088112a7497d29a197126f4f9d4b38e51d0449ff6"} Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.028040 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bd4bc597-nbr9p" Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.206262 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key\") pod \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.206319 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs\") pod \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.206352 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data\") pod \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\" (UID: 
\"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.206395 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj57r\" (UniqueName: \"kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r\") pod \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.206493 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts\") pod \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\" (UID: \"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d\") " Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.207805 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs" (OuterVolumeSpecName: "logs") pod "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" (UID: "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.263545 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" (UID: "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.263707 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r" (OuterVolumeSpecName: "kube-api-access-nj57r") pod "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" (UID: "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d"). 
InnerVolumeSpecName "kube-api-access-nj57r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.264511 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts" (OuterVolumeSpecName: "scripts") pod "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" (UID: "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.264915 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data" (OuterVolumeSpecName: "config-data") pod "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" (UID: "eda99e5a-c7d9-45d9-ac0c-3124eaa4181d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.309368 5118 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.309400 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-logs\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.309425 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.309435 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj57r\" (UniqueName: \"kubernetes.io/projected/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-kube-api-access-nj57r\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.309444 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.869617 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79bd4bc597-nbr9p" event={"ID":"eda99e5a-c7d9-45d9-ac0c-3124eaa4181d","Type":"ContainerDied","Data":"38a063409887bbeb0e5ce347900afb605b490f121f960e300a2a03b7d2a628ac"}
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.869681 5118 scope.go:117] "RemoveContainer" containerID="c46b00bd65273ab9009a77f50b95cd4265a2ac1b2384009c4f687834ac16b13a"
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.869727 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79bd4bc597-nbr9p"
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.898215 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"]
Feb 23 08:53:01 crc kubenswrapper[5118]: I0223 08:53:01.908240 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79bd4bc597-nbr9p"]
Feb 23 08:53:02 crc kubenswrapper[5118]: I0223 08:53:02.072253 5118 scope.go:117] "RemoveContainer" containerID="1b83b7b75dda9f94ae4e589088112a7497d29a197126f4f9d4b38e51d0449ff6"
Feb 23 08:53:03 crc kubenswrapper[5118]: I0223 08:53:03.711987 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" path="/var/lib/kubelet/pods/eda99e5a-c7d9-45d9-ac0c-3124eaa4181d/volumes"
Feb 23 08:53:10 crc kubenswrapper[5118]: I0223 08:53:10.056806 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8c49-account-create-update-jl45l"]
Feb 23 08:53:10 crc kubenswrapper[5118]: I0223 08:53:10.065059 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zc4ct"]
Feb 23 08:53:10 crc kubenswrapper[5118]: I0223 08:53:10.073705 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8c49-account-create-update-jl45l"]
Feb 23 08:53:10 crc kubenswrapper[5118]: I0223 08:53:10.080844 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zc4ct"]
Feb 23 08:53:11 crc kubenswrapper[5118]: I0223 08:53:11.710127 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0c14ba-8d05-47c2-b951-114afe9834c1" path="/var/lib/kubelet/pods/0c0c14ba-8d05-47c2-b951-114afe9834c1/volumes"
Feb 23 08:53:11 crc kubenswrapper[5118]: I0223 08:53:11.710698 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3d9bf0-7540-4c43-b198-e096fa76f5ec" path="/var/lib/kubelet/pods/9c3d9bf0-7540-4c43-b198-e096fa76f5ec/volumes"
Feb 23 08:53:12 crc kubenswrapper[5118]: I0223 08:53:12.697684 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:53:12 crc kubenswrapper[5118]: E0223 08:53:12.698242 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:53:21 crc kubenswrapper[5118]: I0223 08:53:21.064036 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9qmfr"]
Feb 23 08:53:21 crc kubenswrapper[5118]: I0223 08:53:21.080005 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9qmfr"]
Feb 23 08:53:21 crc kubenswrapper[5118]: I0223 08:53:21.712910 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf" path="/var/lib/kubelet/pods/ddd1fa73-a399-4ab8-83fc-0aacfc3eb7bf/volumes"
Feb 23 08:53:25 crc kubenswrapper[5118]: I0223 08:53:25.698214 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:53:25 crc kubenswrapper[5118]: E0223 08:53:25.699443 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.042056 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6994f557dc-cpfnd"]
Feb 23 08:53:38 crc kubenswrapper[5118]: E0223 08:53:38.043161 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043182 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: E0223 08:53:38.043211 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043219 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: E0223 08:53:38.043245 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043254 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: E0223 08:53:38.043280 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043287 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043509 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043553 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e65b9ebe-56f5-4755-95c5-8e787ff56cc7" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043577 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon-log"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.043592 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda99e5a-c7d9-45d9-ac0c-3124eaa4181d" containerName="horizon"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.044907 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.063501 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6994f557dc-cpfnd"]
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.126767 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5090d54b-dd37-4141-b04c-0a5451a4f264-logs\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.126863 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-scripts\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.126991 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrfn\" (UniqueName: \"kubernetes.io/projected/5090d54b-dd37-4141-b04c-0a5451a4f264-kube-api-access-zkrfn\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.127305 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-config-data\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.127471 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5090d54b-dd37-4141-b04c-0a5451a4f264-horizon-secret-key\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.228828 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrfn\" (UniqueName: \"kubernetes.io/projected/5090d54b-dd37-4141-b04c-0a5451a4f264-kube-api-access-zkrfn\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.228949 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-config-data\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.229021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5090d54b-dd37-4141-b04c-0a5451a4f264-horizon-secret-key\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.229046 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5090d54b-dd37-4141-b04c-0a5451a4f264-logs\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.229095 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-scripts\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.229981 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-scripts\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.231292 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5090d54b-dd37-4141-b04c-0a5451a4f264-config-data\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.232159 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5090d54b-dd37-4141-b04c-0a5451a4f264-logs\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.236428 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5090d54b-dd37-4141-b04c-0a5451a4f264-horizon-secret-key\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.254291 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrfn\" (UniqueName: \"kubernetes.io/projected/5090d54b-dd37-4141-b04c-0a5451a4f264-kube-api-access-zkrfn\") pod \"horizon-6994f557dc-cpfnd\" (UID: \"5090d54b-dd37-4141-b04c-0a5451a4f264\") " pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.363859 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6994f557dc-cpfnd"
Feb 23 08:53:38 crc kubenswrapper[5118]: I0223 08:53:38.880980 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6994f557dc-cpfnd"]
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.290654 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6994f557dc-cpfnd" event={"ID":"5090d54b-dd37-4141-b04c-0a5451a4f264","Type":"ContainerStarted","Data":"ff60d880be120b89f6afd63c4d2fe6cbc6163510e1cf2e8fa43f19564057a0f9"}
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.290992 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6994f557dc-cpfnd" event={"ID":"5090d54b-dd37-4141-b04c-0a5451a4f264","Type":"ContainerStarted","Data":"9830e70a3697a1561837f14901f14ab845b446173b870d8099c00f947fccb2f9"}
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.291002 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6994f557dc-cpfnd" event={"ID":"5090d54b-dd37-4141-b04c-0a5451a4f264","Type":"ContainerStarted","Data":"084c7bf757a03a768aae852e5d220347fa7c4bf369d94fc4ff296c0b23b9f340"}
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.324069 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6994f557dc-cpfnd" podStartSLOduration=1.324049501 podStartE2EDuration="1.324049501s" podCreationTimestamp="2026-02-23 08:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:53:39.316191912 +0000 UTC m=+7682.319976475" watchObservedRunningTime="2026-02-23 08:53:39.324049501 +0000 UTC m=+7682.327834084"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.657602 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-d9crc"]
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.659066 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.674204 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-d9crc"]
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.742513 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-7af3-account-create-update-pflph"]
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.743937 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.746670 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.754956 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftbj\" (UniqueName: \"kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.755029 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqd2t\" (UniqueName: \"kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.755148 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.755212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.756145 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-7af3-account-create-update-pflph"]
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.856994 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.857059 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.857161 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftbj\" (UniqueName: \"kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.857196 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqd2t\" (UniqueName: \"kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.857650 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.857903 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.875480 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqd2t\" (UniqueName: \"kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t\") pod \"heat-db-create-d9crc\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") " pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.877466 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftbj\" (UniqueName: \"kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj\") pod \"heat-7af3-account-create-update-pflph\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") " pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:39 crc kubenswrapper[5118]: I0223 08:53:39.997112 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:40 crc kubenswrapper[5118]: I0223 08:53:40.103086 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:40 crc kubenswrapper[5118]: W0223 08:53:40.309455 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7f99e13_f91b_4ee8_a7f1_4e0b7399b0c5.slice/crio-cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3 WatchSource:0}: Error finding container cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3: Status 404 returned error can't find the container with id cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3
Feb 23 08:53:40 crc kubenswrapper[5118]: I0223 08:53:40.311722 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-d9crc"]
Feb 23 08:53:40 crc kubenswrapper[5118]: I0223 08:53:40.433702 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-7af3-account-create-update-pflph"]
Feb 23 08:53:40 crc kubenswrapper[5118]: W0223 08:53:40.441680 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fed72ec_b872_42ca_91e9_354debb471bd.slice/crio-dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966 WatchSource:0}: Error finding container dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966: Status 404 returned error can't find the container with id dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966
Feb 23 08:53:40 crc kubenswrapper[5118]: I0223 08:53:40.697169 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:53:40 crc kubenswrapper[5118]: E0223 08:53:40.697440 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.313660 5118 generic.go:334] "Generic (PLEG): container finished" podID="b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" containerID="390206dd53bd2a04438ba11eb9dd3db5f160cedeefc5f1853c0a8812278a576b" exitCode=0
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.313775 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d9crc" event={"ID":"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5","Type":"ContainerDied","Data":"390206dd53bd2a04438ba11eb9dd3db5f160cedeefc5f1853c0a8812278a576b"}
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.313817 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d9crc" event={"ID":"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5","Type":"ContainerStarted","Data":"cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3"}
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.316565 5118 generic.go:334] "Generic (PLEG): container finished" podID="5fed72ec-b872-42ca-91e9-354debb471bd" containerID="de1cf432c845b9fed9f90796455105de65a739de606adeee1601a42dd09ed884" exitCode=0
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.316662 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-7af3-account-create-update-pflph" event={"ID":"5fed72ec-b872-42ca-91e9-354debb471bd","Type":"ContainerDied","Data":"de1cf432c845b9fed9f90796455105de65a739de606adeee1601a42dd09ed884"}
Feb 23 08:53:41 crc kubenswrapper[5118]: I0223 08:53:41.316725 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-7af3-account-create-update-pflph" event={"ID":"5fed72ec-b872-42ca-91e9-354debb471bd","Type":"ContainerStarted","Data":"dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966"}
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.855041 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.861072 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.946294 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts\") pod \"5fed72ec-b872-42ca-91e9-354debb471bd\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") "
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.946390 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftbj\" (UniqueName: \"kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj\") pod \"5fed72ec-b872-42ca-91e9-354debb471bd\" (UID: \"5fed72ec-b872-42ca-91e9-354debb471bd\") "
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.946523 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqd2t\" (UniqueName: \"kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t\") pod \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") "
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.946649 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts\") pod \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\" (UID: \"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5\") "
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.948405 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fed72ec-b872-42ca-91e9-354debb471bd" (UID: "5fed72ec-b872-42ca-91e9-354debb471bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.948624 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" (UID: "b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.958378 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t" (OuterVolumeSpecName: "kube-api-access-qqd2t") pod "b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" (UID: "b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5"). InnerVolumeSpecName "kube-api-access-qqd2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:53:42 crc kubenswrapper[5118]: I0223 08:53:42.963041 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj" (OuterVolumeSpecName: "kube-api-access-kftbj") pod "5fed72ec-b872-42ca-91e9-354debb471bd" (UID: "5fed72ec-b872-42ca-91e9-354debb471bd"). InnerVolumeSpecName "kube-api-access-kftbj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.049397 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fed72ec-b872-42ca-91e9-354debb471bd-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.049440 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kftbj\" (UniqueName: \"kubernetes.io/projected/5fed72ec-b872-42ca-91e9-354debb471bd-kube-api-access-kftbj\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.049452 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqd2t\" (UniqueName: \"kubernetes.io/projected/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-kube-api-access-qqd2t\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.049461 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.336136 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-d9crc" event={"ID":"b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5","Type":"ContainerDied","Data":"cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3"}
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.336185 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc788a9e0a6cdf4f67e15d27d714865b7d9cc3e598663c87d1dbcf34e39c2bf3"
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.336245 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-d9crc"
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.340962 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-7af3-account-create-update-pflph" event={"ID":"5fed72ec-b872-42ca-91e9-354debb471bd","Type":"ContainerDied","Data":"dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966"}
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.340990 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd1061516318813976d8406950f1608e2fcafed294ee3fe6880b9545e0e7d966"
Feb 23 08:53:43 crc kubenswrapper[5118]: I0223 08:53:43.341026 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-7af3-account-create-update-pflph"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.855087 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-tjk8l"]
Feb 23 08:53:44 crc kubenswrapper[5118]: E0223 08:53:44.855826 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fed72ec-b872-42ca-91e9-354debb471bd" containerName="mariadb-account-create-update"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.855839 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fed72ec-b872-42ca-91e9-354debb471bd" containerName="mariadb-account-create-update"
Feb 23 08:53:44 crc kubenswrapper[5118]: E0223 08:53:44.855851 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" containerName="mariadb-database-create"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.855857 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" containerName="mariadb-database-create"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.856061 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" containerName="mariadb-database-create"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.856083 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fed72ec-b872-42ca-91e9-354debb471bd" containerName="mariadb-account-create-update"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.856774 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.859510 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-nhnpb"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.862749 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.872044 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-tjk8l"]
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.884289 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.884355 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.884480 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flr8t\" (UniqueName: \"kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.985727 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flr8t\" (UniqueName: \"kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.986005 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.987653 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.991544 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:44 crc kubenswrapper[5118]: I0223 08:53:44.991940 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:45 crc kubenswrapper[5118]: I0223 08:53:45.003830 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flr8t\" (UniqueName: \"kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t\") pod \"heat-db-sync-tjk8l\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:45 crc kubenswrapper[5118]: I0223 08:53:45.178390 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-tjk8l"
Feb 23 08:53:45 crc kubenswrapper[5118]: I0223 08:53:45.691439 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-tjk8l"]
Feb 23 08:53:46 crc kubenswrapper[5118]: I0223 08:53:46.071352 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9cp6b"]
Feb 23 08:53:46 crc kubenswrapper[5118]: I0223 08:53:46.085488 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-599a-account-create-update-cbg84"]
Feb 23 08:53:46 crc kubenswrapper[5118]: I0223 08:53:46.096070 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-599a-account-create-update-cbg84"]
Feb 23 08:53:46 crc kubenswrapper[5118]: I0223 08:53:46.104831 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9cp6b"]
Feb 23 08:53:46 crc kubenswrapper[5118]: I0223 08:53:46.380075 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-tjk8l" event={"ID":"6d70fed4-6e64-4129-8fc5-75a44f448492","Type":"ContainerStarted","Data":"8c7668824ff8f5c57a53d94218970cc7043e79cc02725f72de00f8d9ebbf537f"}
Feb 23 08:53:47 crc kubenswrapper[5118]: I0223 08:53:47.723838 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf806736-41ea-43fa-8bef-3a6e1099f205" path="/var/lib/kubelet/pods/cf806736-41ea-43fa-8bef-3a6e1099f205/volumes"
Feb 23 08:53:47 crc kubenswrapper[5118]: I0223 08:53:47.725335 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="da4a68c4-1cea-47aa-8c95-14ac66230b69" path="/var/lib/kubelet/pods/da4a68c4-1cea-47aa-8c95-14ac66230b69/volumes" Feb 23 08:53:48 crc kubenswrapper[5118]: I0223 08:53:48.364654 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6994f557dc-cpfnd" Feb 23 08:53:48 crc kubenswrapper[5118]: I0223 08:53:48.364720 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6994f557dc-cpfnd" Feb 23 08:53:54 crc kubenswrapper[5118]: I0223 08:53:54.697788 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:53:54 crc kubenswrapper[5118]: E0223 08:53:54.698952 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:53:55 crc kubenswrapper[5118]: I0223 08:53:55.038743 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-svcx6"] Feb 23 08:53:55 crc kubenswrapper[5118]: I0223 08:53:55.059328 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-svcx6"] Feb 23 08:53:55 crc kubenswrapper[5118]: I0223 08:53:55.481842 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-tjk8l" event={"ID":"6d70fed4-6e64-4129-8fc5-75a44f448492","Type":"ContainerStarted","Data":"b8e1f2a06064dfc1973d351a0bebc58729628220ebb1f3d731c3aaaf5b585d8f"} Feb 23 08:53:55 crc kubenswrapper[5118]: I0223 08:53:55.509494 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-tjk8l" podStartSLOduration=2.88834329 
podStartE2EDuration="11.509460531s" podCreationTimestamp="2026-02-23 08:53:44 +0000 UTC" firstStartedPulling="2026-02-23 08:53:45.69502909 +0000 UTC m=+7688.698813653" lastFinishedPulling="2026-02-23 08:53:54.316146321 +0000 UTC m=+7697.319930894" observedRunningTime="2026-02-23 08:53:55.506485589 +0000 UTC m=+7698.510270202" watchObservedRunningTime="2026-02-23 08:53:55.509460531 +0000 UTC m=+7698.513245144" Feb 23 08:53:55 crc kubenswrapper[5118]: I0223 08:53:55.711072 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b363e1ea-c245-4751-a7d7-595f7cb83b25" path="/var/lib/kubelet/pods/b363e1ea-c245-4751-a7d7-595f7cb83b25/volumes" Feb 23 08:53:57 crc kubenswrapper[5118]: I0223 08:53:57.512195 5118 generic.go:334] "Generic (PLEG): container finished" podID="6d70fed4-6e64-4129-8fc5-75a44f448492" containerID="b8e1f2a06064dfc1973d351a0bebc58729628220ebb1f3d731c3aaaf5b585d8f" exitCode=0 Feb 23 08:53:57 crc kubenswrapper[5118]: I0223 08:53:57.512259 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-tjk8l" event={"ID":"6d70fed4-6e64-4129-8fc5-75a44f448492","Type":"ContainerDied","Data":"b8e1f2a06064dfc1973d351a0bebc58729628220ebb1f3d731c3aaaf5b585d8f"} Feb 23 08:53:58 crc kubenswrapper[5118]: I0223 08:53:58.368732 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6994f557dc-cpfnd" podUID="5090d54b-dd37-4141-b04c-0a5451a4f264" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.019135 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-tjk8l" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.051048 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle\") pod \"6d70fed4-6e64-4129-8fc5-75a44f448492\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.051249 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flr8t\" (UniqueName: \"kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t\") pod \"6d70fed4-6e64-4129-8fc5-75a44f448492\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.051327 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data\") pod \"6d70fed4-6e64-4129-8fc5-75a44f448492\" (UID: \"6d70fed4-6e64-4129-8fc5-75a44f448492\") " Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.063213 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t" (OuterVolumeSpecName: "kube-api-access-flr8t") pod "6d70fed4-6e64-4129-8fc5-75a44f448492" (UID: "6d70fed4-6e64-4129-8fc5-75a44f448492"). InnerVolumeSpecName "kube-api-access-flr8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.084595 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d70fed4-6e64-4129-8fc5-75a44f448492" (UID: "6d70fed4-6e64-4129-8fc5-75a44f448492"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.135448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data" (OuterVolumeSpecName: "config-data") pod "6d70fed4-6e64-4129-8fc5-75a44f448492" (UID: "6d70fed4-6e64-4129-8fc5-75a44f448492"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.153931 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.153959 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flr8t\" (UniqueName: \"kubernetes.io/projected/6d70fed4-6e64-4129-8fc5-75a44f448492-kube-api-access-flr8t\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.153971 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d70fed4-6e64-4129-8fc5-75a44f448492-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.542707 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-tjk8l" event={"ID":"6d70fed4-6e64-4129-8fc5-75a44f448492","Type":"ContainerDied","Data":"8c7668824ff8f5c57a53d94218970cc7043e79cc02725f72de00f8d9ebbf537f"} Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.543177 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c7668824ff8f5c57a53d94218970cc7043e79cc02725f72de00f8d9ebbf537f" Feb 23 08:53:59 crc kubenswrapper[5118]: I0223 08:53:59.542924 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-tjk8l" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.734984 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5497bc9d57-4jlv5"] Feb 23 08:54:00 crc kubenswrapper[5118]: E0223 08:54:00.736947 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d70fed4-6e64-4129-8fc5-75a44f448492" containerName="heat-db-sync" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.737047 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d70fed4-6e64-4129-8fc5-75a44f448492" containerName="heat-db-sync" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.738483 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d70fed4-6e64-4129-8fc5-75a44f448492" containerName="heat-db-sync" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.739267 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.741921 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.744215 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-nhnpb" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.747881 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.756718 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5497bc9d57-4jlv5"] Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.790484 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8j2\" (UniqueName: \"kubernetes.io/projected/21f60b5a-456e-4912-b16a-f0d3074f9d02-kube-api-access-hl8j2\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: 
\"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.790696 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-combined-ca-bundle\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.790776 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.790823 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data-custom\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.893345 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.893417 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data-custom\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: 
\"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.893465 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8j2\" (UniqueName: \"kubernetes.io/projected/21f60b5a-456e-4912-b16a-f0d3074f9d02-kube-api-access-hl8j2\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.893577 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-combined-ca-bundle\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.901888 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data-custom\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.903268 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-combined-ca-bundle\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.915188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f60b5a-456e-4912-b16a-f0d3074f9d02-config-data\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " 
pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:00 crc kubenswrapper[5118]: I0223 08:54:00.917959 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8j2\" (UniqueName: \"kubernetes.io/projected/21f60b5a-456e-4912-b16a-f0d3074f9d02-kube-api-access-hl8j2\") pod \"heat-engine-5497bc9d57-4jlv5\" (UID: \"21f60b5a-456e-4912-b16a-f0d3074f9d02\") " pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.020493 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-76664f888f-t8jd9"] Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.022746 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.028063 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76664f888f-t8jd9"] Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.029580 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.085511 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-84f9ff9486-k2nsq"] Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.087115 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.089750 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.097738 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-84f9ff9486-k2nsq"] Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.099993 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.100089 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-combined-ca-bundle\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.100212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data-custom\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.100257 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrzxb\" (UniqueName: \"kubernetes.io/projected/1cd05d4d-997d-43f6-8144-90706df15017-kube-api-access-zrzxb\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " 
pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.115881 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.201877 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202044 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data-custom\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202106 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrzxb\" (UniqueName: \"kubernetes.io/projected/1cd05d4d-997d-43f6-8144-90706df15017-kube-api-access-zrzxb\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202142 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-combined-ca-bundle\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202181 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202217 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mg2m\" (UniqueName: \"kubernetes.io/projected/8e69549d-84dc-4239-8aff-a2c5e04dd246-kube-api-access-8mg2m\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202243 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-combined-ca-bundle\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.202267 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data-custom\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.208334 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.210268 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-config-data-custom\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.210541 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cd05d4d-997d-43f6-8144-90706df15017-combined-ca-bundle\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.227044 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrzxb\" (UniqueName: \"kubernetes.io/projected/1cd05d4d-997d-43f6-8144-90706df15017-kube-api-access-zrzxb\") pod \"heat-api-76664f888f-t8jd9\" (UID: \"1cd05d4d-997d-43f6-8144-90706df15017\") " pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.305614 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.307503 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-combined-ca-bundle\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.307660 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mg2m\" (UniqueName: 
\"kubernetes.io/projected/8e69549d-84dc-4239-8aff-a2c5e04dd246-kube-api-access-8mg2m\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.307711 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data-custom\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.314755 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-combined-ca-bundle\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.314971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data-custom\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.315041 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e69549d-84dc-4239-8aff-a2c5e04dd246-config-data\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.333780 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mg2m\" (UniqueName: 
\"kubernetes.io/projected/8e69549d-84dc-4239-8aff-a2c5e04dd246-kube-api-access-8mg2m\") pod \"heat-cfnapi-84f9ff9486-k2nsq\" (UID: \"8e69549d-84dc-4239-8aff-a2c5e04dd246\") " pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.353459 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.405145 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.653628 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5497bc9d57-4jlv5"] Feb 23 08:54:01 crc kubenswrapper[5118]: I0223 08:54:01.852370 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-76664f888f-t8jd9"] Feb 23 08:54:01 crc kubenswrapper[5118]: W0223 08:54:01.861815 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cd05d4d_997d_43f6_8144_90706df15017.slice/crio-9a13ed0192e991c625bc95c8a63363e6a278f05de44834c6bbd75932ca75c14d WatchSource:0}: Error finding container 9a13ed0192e991c625bc95c8a63363e6a278f05de44834c6bbd75932ca75c14d: Status 404 returned error can't find the container with id 9a13ed0192e991c625bc95c8a63363e6a278f05de44834c6bbd75932ca75c14d Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.095731 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-84f9ff9486-k2nsq"] Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.597322 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" event={"ID":"8e69549d-84dc-4239-8aff-a2c5e04dd246","Type":"ContainerStarted","Data":"c8598ccd87fde6dc048e0f6a57b6d7aa64081c2ba99918c831340082399df13d"} Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.599922 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5497bc9d57-4jlv5" event={"ID":"21f60b5a-456e-4912-b16a-f0d3074f9d02","Type":"ContainerStarted","Data":"9774409e245a731591901b543812f6549c36c3e22e21e896d8e6f6bace97aad1"} Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.599974 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5497bc9d57-4jlv5" event={"ID":"21f60b5a-456e-4912-b16a-f0d3074f9d02","Type":"ContainerStarted","Data":"fca3c19397a95795cec5c8c2b5276d4d0076b910ef986e99fc013fe2d801840e"} Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.600337 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.602673 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76664f888f-t8jd9" event={"ID":"1cd05d4d-997d-43f6-8144-90706df15017","Type":"ContainerStarted","Data":"9a13ed0192e991c625bc95c8a63363e6a278f05de44834c6bbd75932ca75c14d"} Feb 23 08:54:02 crc kubenswrapper[5118]: I0223 08:54:02.627131 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5497bc9d57-4jlv5" podStartSLOduration=2.627084153 podStartE2EDuration="2.627084153s" podCreationTimestamp="2026-02-23 08:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:54:02.62077084 +0000 UTC m=+7705.624555413" watchObservedRunningTime="2026-02-23 08:54:02.627084153 +0000 UTC m=+7705.630868716" Feb 23 08:54:03 crc kubenswrapper[5118]: I0223 08:54:03.319053 5118 scope.go:117] "RemoveContainer" containerID="52295287fb99064626e4f85b029292f39e059b8b3af1b87256e2917cb5bcbca8" Feb 23 08:54:03 crc kubenswrapper[5118]: I0223 08:54:03.567159 5118 scope.go:117] "RemoveContainer" containerID="ee73b33aec80b30dc9c913fd721b96c9b2a925fd87b7f41085d40b5c946603c7" Feb 23 08:54:03 crc 
kubenswrapper[5118]: I0223 08:54:03.597385 5118 scope.go:117] "RemoveContainer" containerID="292a5c10539c7b7306bc92442e41a062f52e91400f8b7275e1dbe8c80121b816" Feb 23 08:54:03 crc kubenswrapper[5118]: I0223 08:54:03.753362 5118 scope.go:117] "RemoveContainer" containerID="7e1702e98e6713f525cd60a9a1e3804e6c34d0d6ebe3057c0f901b0a02810eaf" Feb 23 08:54:03 crc kubenswrapper[5118]: I0223 08:54:03.824874 5118 scope.go:117] "RemoveContainer" containerID="49d77e9bcd3309a29df1c19fcc9a49b491ea5a748b6ed16d3b07372557a17234" Feb 23 08:54:03 crc kubenswrapper[5118]: I0223 08:54:03.851396 5118 scope.go:117] "RemoveContainer" containerID="02a9aea63e1501316ecf7d3daf0e6b20b4b0c3c97060236f6d4545d4837ced3b" Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.643282 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-76664f888f-t8jd9" event={"ID":"1cd05d4d-997d-43f6-8144-90706df15017","Type":"ContainerStarted","Data":"68d0abd6e95b8bf89390ddeeb672a280d85783d36ad2f21745f01598716957aa"} Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.643626 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.646319 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" event={"ID":"8e69549d-84dc-4239-8aff-a2c5e04dd246","Type":"ContainerStarted","Data":"2f5851b233c6ae81ff69b10355608bff66a35ed5ced3ed5f74ec415be4801cb2"} Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.646535 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.665142 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-76664f888f-t8jd9" podStartSLOduration=2.929608767 podStartE2EDuration="4.665126396s" podCreationTimestamp="2026-02-23 08:54:00 +0000 UTC" 
firstStartedPulling="2026-02-23 08:54:01.864834434 +0000 UTC m=+7704.868619007" lastFinishedPulling="2026-02-23 08:54:03.600352053 +0000 UTC m=+7706.604136636" observedRunningTime="2026-02-23 08:54:04.662037652 +0000 UTC m=+7707.665822225" watchObservedRunningTime="2026-02-23 08:54:04.665126396 +0000 UTC m=+7707.668910969" Feb 23 08:54:04 crc kubenswrapper[5118]: I0223 08:54:04.691983 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" podStartSLOduration=2.229019221 podStartE2EDuration="3.691960013s" podCreationTimestamp="2026-02-23 08:54:01 +0000 UTC" firstStartedPulling="2026-02-23 08:54:02.135295629 +0000 UTC m=+7705.139080202" lastFinishedPulling="2026-02-23 08:54:03.598236421 +0000 UTC m=+7706.602020994" observedRunningTime="2026-02-23 08:54:04.68226069 +0000 UTC m=+7707.686045263" watchObservedRunningTime="2026-02-23 08:54:04.691960013 +0000 UTC m=+7707.695744586" Feb 23 08:54:07 crc kubenswrapper[5118]: I0223 08:54:07.709072 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:54:07 crc kubenswrapper[5118]: E0223 08:54:07.710614 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:54:10 crc kubenswrapper[5118]: I0223 08:54:10.863352 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6994f557dc-cpfnd" Feb 23 08:54:11 crc kubenswrapper[5118]: I0223 08:54:11.156638 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5497bc9d57-4jlv5" Feb 23 08:54:12 
crc kubenswrapper[5118]: I0223 08:54:12.780734 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-76664f888f-t8jd9" Feb 23 08:54:12 crc kubenswrapper[5118]: I0223 08:54:12.927954 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-84f9ff9486-k2nsq" Feb 23 08:54:13 crc kubenswrapper[5118]: I0223 08:54:13.355845 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6994f557dc-cpfnd" Feb 23 08:54:13 crc kubenswrapper[5118]: I0223 08:54:13.417460 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:54:13 crc kubenswrapper[5118]: I0223 08:54:13.417756 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon-log" containerID="cri-o://edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c" gracePeriod=30 Feb 23 08:54:13 crc kubenswrapper[5118]: I0223 08:54:13.417923 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" containerID="cri-o://4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad" gracePeriod=30 Feb 23 08:54:16 crc kubenswrapper[5118]: I0223 08:54:16.764927 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Feb 23 08:54:16 crc kubenswrapper[5118]: I0223 08:54:16.776343 5118 generic.go:334] "Generic (PLEG): container finished" podID="84e0f825-b01b-441d-a14c-3fd421cc67ff" 
containerID="4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad" exitCode=0 Feb 23 08:54:16 crc kubenswrapper[5118]: I0223 08:54:16.776414 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerDied","Data":"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad"} Feb 23 08:54:21 crc kubenswrapper[5118]: I0223 08:54:21.914532 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7"] Feb 23 08:54:21 crc kubenswrapper[5118]: I0223 08:54:21.916791 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:21 crc kubenswrapper[5118]: I0223 08:54:21.922950 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 08:54:21 crc kubenswrapper[5118]: I0223 08:54:21.940161 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7"] Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.106086 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjbk\" (UniqueName: \"kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.106594 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.106906 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.208927 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.209036 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjbk\" (UniqueName: \"kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.209087 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.209699 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.209889 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.229574 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjbk\" (UniqueName: \"kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.265252 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.697412 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:54:22 crc kubenswrapper[5118]: E0223 08:54:22.698289 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.786373 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7"] Feb 23 08:54:22 crc kubenswrapper[5118]: I0223 08:54:22.855148 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" event={"ID":"cd06a418-d41c-43a0-84fe-c05fb6e60b51","Type":"ContainerStarted","Data":"5a8fa4f3227a9aaf124827f07a8d8871bd553d3b524d9285df1d9cd020622eb1"} Feb 23 08:54:23 crc kubenswrapper[5118]: I0223 08:54:23.873278 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerID="51647bba0d671f06698c77231a7a85605f24de33a801f8a803f0ca6445f0bbb5" exitCode=0 Feb 23 08:54:23 crc kubenswrapper[5118]: I0223 08:54:23.873365 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" event={"ID":"cd06a418-d41c-43a0-84fe-c05fb6e60b51","Type":"ContainerDied","Data":"51647bba0d671f06698c77231a7a85605f24de33a801f8a803f0ca6445f0bbb5"} Feb 23 08:54:25 crc 
kubenswrapper[5118]: I0223 08:54:25.899377 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerID="729fd3d57ed47162da59f58a9793d100d1415c98947cb7515261a0d9fb49e3b8" exitCode=0 Feb 23 08:54:25 crc kubenswrapper[5118]: I0223 08:54:25.899452 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" event={"ID":"cd06a418-d41c-43a0-84fe-c05fb6e60b51","Type":"ContainerDied","Data":"729fd3d57ed47162da59f58a9793d100d1415c98947cb7515261a0d9fb49e3b8"} Feb 23 08:54:26 crc kubenswrapper[5118]: I0223 08:54:26.764825 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Feb 23 08:54:26 crc kubenswrapper[5118]: I0223 08:54:26.912268 5118 generic.go:334] "Generic (PLEG): container finished" podID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerID="9ee77bbf3c419b68e1c299be1fa72bb64f2beaa97f7a2084930717a0a8549ba7" exitCode=0 Feb 23 08:54:26 crc kubenswrapper[5118]: I0223 08:54:26.912322 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" event={"ID":"cd06a418-d41c-43a0-84fe-c05fb6e60b51","Type":"ContainerDied","Data":"9ee77bbf3c419b68e1c299be1fa72bb64f2beaa97f7a2084930717a0a8549ba7"} Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.412065 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.562026 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxjbk\" (UniqueName: \"kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk\") pod \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.562224 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util\") pod \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.562573 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle\") pod \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\" (UID: \"cd06a418-d41c-43a0-84fe-c05fb6e60b51\") " Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.564118 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle" (OuterVolumeSpecName: "bundle") pod "cd06a418-d41c-43a0-84fe-c05fb6e60b51" (UID: "cd06a418-d41c-43a0-84fe-c05fb6e60b51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.569322 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk" (OuterVolumeSpecName: "kube-api-access-nxjbk") pod "cd06a418-d41c-43a0-84fe-c05fb6e60b51" (UID: "cd06a418-d41c-43a0-84fe-c05fb6e60b51"). InnerVolumeSpecName "kube-api-access-nxjbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.571915 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util" (OuterVolumeSpecName: "util") pod "cd06a418-d41c-43a0-84fe-c05fb6e60b51" (UID: "cd06a418-d41c-43a0-84fe-c05fb6e60b51"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.666077 5118 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-util\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.666186 5118 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd06a418-d41c-43a0-84fe-c05fb6e60b51-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.666214 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxjbk\" (UniqueName: \"kubernetes.io/projected/cd06a418-d41c-43a0-84fe-c05fb6e60b51-kube-api-access-nxjbk\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.944303 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" event={"ID":"cd06a418-d41c-43a0-84fe-c05fb6e60b51","Type":"ContainerDied","Data":"5a8fa4f3227a9aaf124827f07a8d8871bd553d3b524d9285df1d9cd020622eb1"} Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.944721 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8fa4f3227a9aaf124827f07a8d8871bd553d3b524d9285df1d9cd020622eb1" Feb 23 08:54:28 crc kubenswrapper[5118]: I0223 08:54:28.944416 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7" Feb 23 08:54:34 crc kubenswrapper[5118]: I0223 08:54:34.697927 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:54:34 crc kubenswrapper[5118]: E0223 08:54:34.698977 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:54:36 crc kubenswrapper[5118]: I0223 08:54:36.766305 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-8695445c8f-brjrf" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Feb 23 08:54:36 crc kubenswrapper[5118]: I0223 08:54:36.767672 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.046411 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5b81-account-create-update-7bbtq"] Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.055716 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-28f62"] Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.068034 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5b81-account-create-update-7bbtq"] Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.076159 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-28f62"] Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.715680 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90b03a3-618b-48e0-8de1-6290a7cf0140" path="/var/lib/kubelet/pods/c90b03a3-618b-48e0-8de1-6290a7cf0140/volumes" Feb 23 08:54:37 crc kubenswrapper[5118]: I0223 08:54:37.721027 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf300251-6812-4636-8f38-10cfe37fdcd0" path="/var/lib/kubelet/pods/cf300251-6812-4636-8f38-10cfe37fdcd0/volumes" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.851351 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv"] Feb 23 08:54:39 crc kubenswrapper[5118]: E0223 08:54:39.852369 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="util" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.852386 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="util" Feb 23 08:54:39 crc kubenswrapper[5118]: E0223 08:54:39.852404 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="pull" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.852411 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="pull" Feb 23 08:54:39 crc kubenswrapper[5118]: E0223 08:54:39.852427 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="extract" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.852436 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="extract" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.852621 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cd06a418-d41c-43a0-84fe-c05fb6e60b51" containerName="extract" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.855286 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.856995 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rqjzx" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.857225 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.858513 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.903739 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv"] Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.925222 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vv9\" (UniqueName: \"kubernetes.io/projected/de0aafc9-dc3e-4457-96c6-5ef2ba36efbc-kube-api-access-g4vv9\") pod \"obo-prometheus-operator-68bc856cb9-5w9fv\" (UID: \"de0aafc9-dc3e-4457-96c6-5ef2ba36efbc\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.992938 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq"] Feb 23 08:54:39 crc kubenswrapper[5118]: I0223 08:54:39.994148 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.002699 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.004804 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-l9hh6" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.024520 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.026015 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.029389 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.029518 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.029605 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g4vv9\" (UniqueName: \"kubernetes.io/projected/de0aafc9-dc3e-4457-96c6-5ef2ba36efbc-kube-api-access-g4vv9\") pod \"obo-prometheus-operator-68bc856cb9-5w9fv\" (UID: \"de0aafc9-dc3e-4457-96c6-5ef2ba36efbc\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.049310 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.062742 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4vv9\" (UniqueName: \"kubernetes.io/projected/de0aafc9-dc3e-4457-96c6-5ef2ba36efbc-kube-api-access-g4vv9\") pod \"obo-prometheus-operator-68bc856cb9-5w9fv\" (UID: \"de0aafc9-dc3e-4457-96c6-5ef2ba36efbc\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.090537 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.136357 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: \"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.136421 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: 
\"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.136465 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.136538 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.140004 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.152702 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17fb334f-b958-4fba-b22b-f18ba52f29ae-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq\" (UID: \"17fb334f-b958-4fba-b22b-f18ba52f29ae\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.197829 
5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2wznd"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.199740 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.209932 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-gqdzg" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.210336 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.221188 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2wznd"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.226828 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.238671 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dks\" (UniqueName: \"kubernetes.io/projected/1b496827-29a0-4c5b-9d77-764163a0ddf7-kube-api-access-x6dks\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.239016 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: \"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 
08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.239052 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: \"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.239085 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b496827-29a0-4c5b-9d77-764163a0ddf7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.247066 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: \"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.247286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92174a3a-511b-43fd-ac12-662d7d419640-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj\" (UID: \"92174a3a-511b-43fd-ac12-662d7d419640\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.329934 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.342279 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6dks\" (UniqueName: \"kubernetes.io/projected/1b496827-29a0-4c5b-9d77-764163a0ddf7-kube-api-access-x6dks\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.342389 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b496827-29a0-4c5b-9d77-764163a0ddf7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.348637 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b496827-29a0-4c5b-9d77-764163a0ddf7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.359173 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.361144 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6dks\" (UniqueName: \"kubernetes.io/projected/1b496827-29a0-4c5b-9d77-764163a0ddf7-kube-api-access-x6dks\") pod \"observability-operator-59bdc8b94-2wznd\" (UID: \"1b496827-29a0-4c5b-9d77-764163a0ddf7\") " pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.411543 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtfft"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.413010 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.421597 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-c4r25" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.425332 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtfft"] Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.443602 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.443715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kmz\" (UniqueName: \"kubernetes.io/projected/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-kube-api-access-n5kmz\") 
pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.545172 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.545514 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kmz\" (UniqueName: \"kubernetes.io/projected/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-kube-api-access-n5kmz\") pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.546159 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.573444 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kmz\" (UniqueName: \"kubernetes.io/projected/97e1bcf9-32e9-4db5-8b2f-2113aadd6a86-kube-api-access-n5kmz\") pod \"perses-operator-5bf474d74f-dtfft\" (UID: \"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86\") " pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.639694 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-2wznd" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.772809 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dtfft" Feb 23 08:54:40 crc kubenswrapper[5118]: I0223 08:54:40.906304 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv"] Feb 23 08:54:40 crc kubenswrapper[5118]: W0223 08:54:40.938158 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde0aafc9_dc3e_4457_96c6_5ef2ba36efbc.slice/crio-3e1be64a5ae8a70e88d64402245bc609180c9055a61959bc4691aa590eccc5d0 WatchSource:0}: Error finding container 3e1be64a5ae8a70e88d64402245bc609180c9055a61959bc4691aa590eccc5d0: Status 404 returned error can't find the container with id 3e1be64a5ae8a70e88d64402245bc609180c9055a61959bc4691aa590eccc5d0 Feb 23 08:54:41 crc kubenswrapper[5118]: W0223 08:54:41.032153 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fb334f_b958_4fba_b22b_f18ba52f29ae.slice/crio-2a1c750af9a57f2dc8192ba6d0976b46cad9bc508a4306bb706ef0d969c57168 WatchSource:0}: Error finding container 2a1c750af9a57f2dc8192ba6d0976b46cad9bc508a4306bb706ef0d969c57168: Status 404 returned error can't find the container with id 2a1c750af9a57f2dc8192ba6d0976b46cad9bc508a4306bb706ef0d969c57168 Feb 23 08:54:41 crc kubenswrapper[5118]: I0223 08:54:41.032400 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq"] Feb 23 08:54:41 crc kubenswrapper[5118]: I0223 08:54:41.069665 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" 
event={"ID":"17fb334f-b958-4fba-b22b-f18ba52f29ae","Type":"ContainerStarted","Data":"2a1c750af9a57f2dc8192ba6d0976b46cad9bc508a4306bb706ef0d969c57168"} Feb 23 08:54:41 crc kubenswrapper[5118]: I0223 08:54:41.071652 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" event={"ID":"de0aafc9-dc3e-4457-96c6-5ef2ba36efbc","Type":"ContainerStarted","Data":"3e1be64a5ae8a70e88d64402245bc609180c9055a61959bc4691aa590eccc5d0"} Feb 23 08:54:41 crc kubenswrapper[5118]: I0223 08:54:41.103656 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj"] Feb 23 08:54:41 crc kubenswrapper[5118]: W0223 08:54:41.114670 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92174a3a_511b_43fd_ac12_662d7d419640.slice/crio-c1009a455bf752df8ae431de9dc5ef77ced03fb724ee043657d2ee57df40e8bc WatchSource:0}: Error finding container c1009a455bf752df8ae431de9dc5ef77ced03fb724ee043657d2ee57df40e8bc: Status 404 returned error can't find the container with id c1009a455bf752df8ae431de9dc5ef77ced03fb724ee043657d2ee57df40e8bc Feb 23 08:54:41 crc kubenswrapper[5118]: W0223 08:54:41.290072 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b496827_29a0_4c5b_9d77_764163a0ddf7.slice/crio-75b78acfa71b9c25e873e9a13225c90189d1225b3d1ff629a0a8e6c9ef74f431 WatchSource:0}: Error finding container 75b78acfa71b9c25e873e9a13225c90189d1225b3d1ff629a0a8e6c9ef74f431: Status 404 returned error can't find the container with id 75b78acfa71b9c25e873e9a13225c90189d1225b3d1ff629a0a8e6c9ef74f431 Feb 23 08:54:41 crc kubenswrapper[5118]: I0223 08:54:41.293213 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-2wznd"] Feb 23 08:54:41 crc kubenswrapper[5118]: 
I0223 08:54:41.488488 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dtfft"] Feb 23 08:54:41 crc kubenswrapper[5118]: W0223 08:54:41.518394 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e1bcf9_32e9_4db5_8b2f_2113aadd6a86.slice/crio-3858451cf75db39496ca60e07d9485e245b060d05d37815b64c4b6ac604c28c0 WatchSource:0}: Error finding container 3858451cf75db39496ca60e07d9485e245b060d05d37815b64c4b6ac604c28c0: Status 404 returned error can't find the container with id 3858451cf75db39496ca60e07d9485e245b060d05d37815b64c4b6ac604c28c0 Feb 23 08:54:42 crc kubenswrapper[5118]: I0223 08:54:42.103008 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dtfft" event={"ID":"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86","Type":"ContainerStarted","Data":"3858451cf75db39496ca60e07d9485e245b060d05d37815b64c4b6ac604c28c0"} Feb 23 08:54:42 crc kubenswrapper[5118]: I0223 08:54:42.107952 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" event={"ID":"92174a3a-511b-43fd-ac12-662d7d419640","Type":"ContainerStarted","Data":"c1009a455bf752df8ae431de9dc5ef77ced03fb724ee043657d2ee57df40e8bc"} Feb 23 08:54:42 crc kubenswrapper[5118]: I0223 08:54:42.119226 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2wznd" event={"ID":"1b496827-29a0-4c5b-9d77-764163a0ddf7","Type":"ContainerStarted","Data":"75b78acfa71b9c25e873e9a13225c90189d1225b3d1ff629a0a8e6c9ef74f431"} Feb 23 08:54:43 crc kubenswrapper[5118]: E0223 08:54:43.737190 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e0f825_b01b_441d_a14c_3fd421cc67ff.slice/crio-conmon-edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e0f825_b01b_441d_a14c_3fd421cc67ff.slice/crio-edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.106907 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.202239 5118 generic.go:334] "Generic (PLEG): container finished" podID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerID="edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c" exitCode=137 Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.202333 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerDied","Data":"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c"} Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.202379 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8695445c8f-brjrf" event={"ID":"84e0f825-b01b-441d-a14c-3fd421cc67ff","Type":"ContainerDied","Data":"50dd73f37f1138e8963930ac40a6e54498b69835deed35141dbb1160e85cb1b8"} Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.202402 5118 scope.go:117] "RemoveContainer" containerID="4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.202442 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8695445c8f-brjrf" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.243754 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts\") pod \"84e0f825-b01b-441d-a14c-3fd421cc67ff\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.243853 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs\") pod \"84e0f825-b01b-441d-a14c-3fd421cc67ff\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.244017 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s4nx\" (UniqueName: \"kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx\") pod \"84e0f825-b01b-441d-a14c-3fd421cc67ff\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.244061 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data\") pod \"84e0f825-b01b-441d-a14c-3fd421cc67ff\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.244177 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key\") pod \"84e0f825-b01b-441d-a14c-3fd421cc67ff\" (UID: \"84e0f825-b01b-441d-a14c-3fd421cc67ff\") " Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.244297 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs" (OuterVolumeSpecName: "logs") pod "84e0f825-b01b-441d-a14c-3fd421cc67ff" (UID: "84e0f825-b01b-441d-a14c-3fd421cc67ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.245128 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84e0f825-b01b-441d-a14c-3fd421cc67ff-logs\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.255335 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx" (OuterVolumeSpecName: "kube-api-access-6s4nx") pod "84e0f825-b01b-441d-a14c-3fd421cc67ff" (UID: "84e0f825-b01b-441d-a14c-3fd421cc67ff"). InnerVolumeSpecName "kube-api-access-6s4nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.257302 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "84e0f825-b01b-441d-a14c-3fd421cc67ff" (UID: "84e0f825-b01b-441d-a14c-3fd421cc67ff"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.277826 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts" (OuterVolumeSpecName: "scripts") pod "84e0f825-b01b-441d-a14c-3fd421cc67ff" (UID: "84e0f825-b01b-441d-a14c-3fd421cc67ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.340153 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data" (OuterVolumeSpecName: "config-data") pod "84e0f825-b01b-441d-a14c-3fd421cc67ff" (UID: "84e0f825-b01b-441d-a14c-3fd421cc67ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.347296 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s4nx\" (UniqueName: \"kubernetes.io/projected/84e0f825-b01b-441d-a14c-3fd421cc67ff-kube-api-access-6s4nx\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.347334 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.347344 5118 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/84e0f825-b01b-441d-a14c-3fd421cc67ff-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.347353 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84e0f825-b01b-441d-a14c-3fd421cc67ff-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.563706 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.574187 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8695445c8f-brjrf"] Feb 23 08:54:44 crc kubenswrapper[5118]: I0223 08:54:44.984250 5118 scope.go:117] "RemoveContainer" 
containerID="edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c" Feb 23 08:54:45 crc kubenswrapper[5118]: I0223 08:54:45.133877 5118 scope.go:117] "RemoveContainer" containerID="4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad" Feb 23 08:54:45 crc kubenswrapper[5118]: E0223 08:54:45.135612 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad\": container with ID starting with 4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad not found: ID does not exist" containerID="4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad" Feb 23 08:54:45 crc kubenswrapper[5118]: I0223 08:54:45.135675 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad"} err="failed to get container status \"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad\": rpc error: code = NotFound desc = could not find container \"4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad\": container with ID starting with 4b08165fc22f4833b45640cdba602148e76c8e737c3db525821d45f7969302ad not found: ID does not exist" Feb 23 08:54:45 crc kubenswrapper[5118]: I0223 08:54:45.135713 5118 scope.go:117] "RemoveContainer" containerID="edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c" Feb 23 08:54:45 crc kubenswrapper[5118]: E0223 08:54:45.136273 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c\": container with ID starting with edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c not found: ID does not exist" containerID="edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c" Feb 23 08:54:45 crc 
kubenswrapper[5118]: I0223 08:54:45.136295 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c"} err="failed to get container status \"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c\": rpc error: code = NotFound desc = could not find container \"edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c\": container with ID starting with edfbd0ac2939304623b2f06e6e2fcd3e25b623708cb08d1adbffad1e4565457c not found: ID does not exist" Feb 23 08:54:45 crc kubenswrapper[5118]: I0223 08:54:45.701138 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:54:45 crc kubenswrapper[5118]: E0223 08:54:45.701916 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 08:54:45 crc kubenswrapper[5118]: I0223 08:54:45.721646 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" path="/var/lib/kubelet/pods/84e0f825-b01b-441d-a14c-3fd421cc67ff/volumes" Feb 23 08:54:57 crc kubenswrapper[5118]: E0223 08:54:57.056936 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a" Feb 23 08:54:57 crc kubenswrapper[5118]: E0223 08:54:57.057793 5118 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true --disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g4vv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-68bc856cb9-5w9fv_openshift-operators(de0aafc9-dc3e-4457-96c6-5ef2ba36efbc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 08:54:57 crc kubenswrapper[5118]: E0223 08:54:57.059043 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" podUID="de0aafc9-dc3e-4457-96c6-5ef2ba36efbc" Feb 23 08:54:57 crc kubenswrapper[5118]: E0223 08:54:57.436088 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:e7e5f4c5e8ab0ba298ef0295a7137d438a42eb177d9322212cde6ba8f367912a\\\"\"" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" podUID="de0aafc9-dc3e-4457-96c6-5ef2ba36efbc"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.443968 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dtfft" event={"ID":"97e1bcf9-32e9-4db5-8b2f-2113aadd6a86","Type":"ContainerStarted","Data":"3dbbbaa5a91e43cfe9abfaf71d2455a8c24249e8dfbb8099ebd6c89010a75064"}
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.444288 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dtfft"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.445593 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" event={"ID":"92174a3a-511b-43fd-ac12-662d7d419640","Type":"ContainerStarted","Data":"b82121462d952e591cdb5eb234078d63befe28c058d9e6fd4a71eaa4e2861024"}
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.447252 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-2wznd" event={"ID":"1b496827-29a0-4c5b-9d77-764163a0ddf7","Type":"ContainerStarted","Data":"0b9c6a7f24288b2998ce474ccf007c0919013d49dc8816c44686b69b958ea022"}
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.447501 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-2wznd"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.449480 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" event={"ID":"17fb334f-b958-4fba-b22b-f18ba52f29ae","Type":"ContainerStarted","Data":"88a55fcd613c60d06bb6bf729451b9db0e7e721a19c17e69facf2139691c5959"}
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.449949 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-2wznd"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.470859 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dtfft" podStartSLOduration=2.8724918649999998 podStartE2EDuration="18.470839946s" podCreationTimestamp="2026-02-23 08:54:40 +0000 UTC" firstStartedPulling="2026-02-23 08:54:41.521730433 +0000 UTC m=+7744.525515006" lastFinishedPulling="2026-02-23 08:54:57.120078514 +0000 UTC m=+7760.123863087" observedRunningTime="2026-02-23 08:54:58.466651325 +0000 UTC m=+7761.470435888" watchObservedRunningTime="2026-02-23 08:54:58.470839946 +0000 UTC m=+7761.474624519"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.499671 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-2wznd" podStartSLOduration=2.579697333 podStartE2EDuration="18.49964844s" podCreationTimestamp="2026-02-23 08:54:40 +0000 UTC" firstStartedPulling="2026-02-23 08:54:41.293751092 +0000 UTC m=+7744.297535665" lastFinishedPulling="2026-02-23 08:54:57.213702189 +0000 UTC m=+7760.217486772" observedRunningTime="2026-02-23 08:54:58.488135463 +0000 UTC m=+7761.491920056" watchObservedRunningTime="2026-02-23 08:54:58.49964844 +0000 UTC m=+7761.503433013"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.512833 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq" podStartSLOduration=3.43673339 podStartE2EDuration="19.512813627s" podCreationTimestamp="2026-02-23 08:54:39 +0000 UTC" firstStartedPulling="2026-02-23 08:54:41.042551592 +0000 UTC m=+7744.046336165" lastFinishedPulling="2026-02-23 08:54:57.118631809 +0000 UTC m=+7760.122416402" observedRunningTime="2026-02-23 08:54:58.509974608 +0000 UTC m=+7761.513759171" watchObservedRunningTime="2026-02-23 08:54:58.512813627 +0000 UTC m=+7761.516598190"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.559205 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj" podStartSLOduration=3.558557853 podStartE2EDuration="19.559182553s" podCreationTimestamp="2026-02-23 08:54:39 +0000 UTC" firstStartedPulling="2026-02-23 08:54:41.129624189 +0000 UTC m=+7744.133408762" lastFinishedPulling="2026-02-23 08:54:57.130248889 +0000 UTC m=+7760.134033462" observedRunningTime="2026-02-23 08:54:58.545001232 +0000 UTC m=+7761.548785805" watchObservedRunningTime="2026-02-23 08:54:58.559182553 +0000 UTC m=+7761.562967126"
Feb 23 08:54:58 crc kubenswrapper[5118]: I0223 08:54:58.698372 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:54:58 crc kubenswrapper[5118]: E0223 08:54:58.698736 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:55:03 crc kubenswrapper[5118]: I0223 08:55:03.049462 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-9mmtr"]
Feb 23 08:55:03 crc kubenswrapper[5118]: I0223 08:55:03.059273 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-9mmtr"]
Feb 23 08:55:03 crc kubenswrapper[5118]: I0223 08:55:03.710387 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff365ce8-8d93-410b-a97d-5f5b947652d9" path="/var/lib/kubelet/pods/ff365ce8-8d93-410b-a97d-5f5b947652d9/volumes"
Feb 23 08:55:04 crc kubenswrapper[5118]: I0223 08:55:04.022062 5118 scope.go:117] "RemoveContainer" containerID="9ff3711eb0b440cc3fc36d5622e4325e7b5eb58682bbe5f6e167110cf8a7c7be"
Feb 23 08:55:04 crc kubenswrapper[5118]: I0223 08:55:04.094225 5118 scope.go:117] "RemoveContainer" containerID="2528c65314fe9fefc01886e924b1e15c1e14b905f86bd5ce7affd82fafe9704e"
Feb 23 08:55:04 crc kubenswrapper[5118]: I0223 08:55:04.129338 5118 scope.go:117] "RemoveContainer" containerID="86dd6def0e4fea1e79f818bbe1560fa124fc10038c2e66f922c5f5e605a928b5"
Feb 23 08:55:09 crc kubenswrapper[5118]: I0223 08:55:09.707682 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 08:55:10 crc kubenswrapper[5118]: I0223 08:55:10.698710 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:55:10 crc kubenswrapper[5118]: E0223 08:55:10.699480 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:55:10 crc kubenswrapper[5118]: I0223 08:55:10.777925 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dtfft"
Feb 23 08:55:11 crc kubenswrapper[5118]: I0223 08:55:11.595500 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" event={"ID":"de0aafc9-dc3e-4457-96c6-5ef2ba36efbc","Type":"ContainerStarted","Data":"91bd5d1ae1bde1a4383e4d1734491dfd6379319c6c0e0d7eb3224fc160f18fc5"}
Feb 23 08:55:11 crc kubenswrapper[5118]: I0223 08:55:11.626696 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5w9fv" podStartSLOduration=3.214176059 podStartE2EDuration="32.626668991s" podCreationTimestamp="2026-02-23 08:54:39 +0000 UTC" firstStartedPulling="2026-02-23 08:54:40.962182286 +0000 UTC m=+7743.965966859" lastFinishedPulling="2026-02-23 08:55:10.374675188 +0000 UTC m=+7773.378459791" observedRunningTime="2026-02-23 08:55:11.610747448 +0000 UTC m=+7774.614532031" watchObservedRunningTime="2026-02-23 08:55:11.626668991 +0000 UTC m=+7774.630453574"
Feb 23 08:55:13 crc kubenswrapper[5118]: I0223 08:55:13.935840 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:13 crc kubenswrapper[5118]: I0223 08:55:13.936561 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" containerName="openstackclient" containerID="cri-o://0a741d03c4a5a884278bb363145d5230b6ced0174f1221aeef81de5d08cf0361" gracePeriod=2
Feb 23 08:55:13 crc kubenswrapper[5118]: I0223 08:55:13.959868 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.111498 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.203299 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon-log"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.203358 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon-log"
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.203373 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.203382 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon"
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.203486 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" containerName="openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.203498 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" containerName="openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.204669 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" containerName="openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.204703 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.204746 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e0f825-b01b-441d-a14c-3fd421cc67ff" containerName="horizon-log"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.206053 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.242596 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="207eaf10-0008-4eb2-879e-55063bb9f189" podUID="78cdaa37-27be-4446-b098-11f5a3b136e9"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.251337 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.265564 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f8be430c-8919-4285-b27c-1df1a6765f35" podUID="78cdaa37-27be-4446-b098-11f5a3b136e9"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.318461 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.320105 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-q6szl openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="207eaf10-0008-4eb2-879e-55063bb9f189"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.343437 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6szl\" (UniqueName: \"kubernetes.io/projected/207eaf10-0008-4eb2-879e-55063bb9f189-kube-api-access-q6szl\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.343528 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.343582 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.364878 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.384165 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.385884 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.396964 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.411365 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.413219 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.416689 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xlgqw"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.435271 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470026 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470149 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470184 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h5tf\" (UniqueName: \"kubernetes.io/projected/78cdaa37-27be-4446-b098-11f5a3b136e9-kube-api-access-8h5tf\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470224 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6szl\" (UniqueName: \"kubernetes.io/projected/207eaf10-0008-4eb2-879e-55063bb9f189-kube-api-access-q6szl\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470249 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7ht\" (UniqueName: \"kubernetes.io/projected/70317625-f015-449e-bee0-152cd305ffea-kube-api-access-fw7ht\") pod \"kube-state-metrics-0\" (UID: \"70317625-f015-449e-bee0-152cd305ffea\") " pod="openstack/kube-state-metrics-0"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470275 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.470323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.475132 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.485575 5118 projected.go:194] Error preparing data for projected volume kube-api-access-q6szl for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (207eaf10-0008-4eb2-879e-55063bb9f189) does not match the UID in record. The object might have been deleted and then recreated
Feb 23 08:55:14 crc kubenswrapper[5118]: E0223 08:55:14.485716 5118 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/207eaf10-0008-4eb2-879e-55063bb9f189-kube-api-access-q6szl podName:207eaf10-0008-4eb2-879e-55063bb9f189 nodeName:}" failed. No retries permitted until 2026-02-23 08:55:14.985675618 +0000 UTC m=+7777.989460201 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q6szl" (UniqueName: "kubernetes.io/projected/207eaf10-0008-4eb2-879e-55063bb9f189-kube-api-access-q6szl") pod "openstackclient" (UID: "207eaf10-0008-4eb2-879e-55063bb9f189") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (207eaf10-0008-4eb2-879e-55063bb9f189) does not match the UID in record. The object might have been deleted and then recreated
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.487830 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret\") pod \"openstackclient\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.573525 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.573598 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h5tf\" (UniqueName: \"kubernetes.io/projected/78cdaa37-27be-4446-b098-11f5a3b136e9-kube-api-access-8h5tf\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.573694 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7ht\" (UniqueName: \"kubernetes.io/projected/70317625-f015-449e-bee0-152cd305ffea-kube-api-access-fw7ht\") pod \"kube-state-metrics-0\" (UID: \"70317625-f015-449e-bee0-152cd305ffea\") " pod="openstack/kube-state-metrics-0"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.573723 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.574856 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.586597 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/78cdaa37-27be-4446-b098-11f5a3b136e9-openstack-config-secret\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.624190 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h5tf\" (UniqueName: \"kubernetes.io/projected/78cdaa37-27be-4446-b098-11f5a3b136e9-kube-api-access-8h5tf\") pod \"openstackclient\" (UID: \"78cdaa37-27be-4446-b098-11f5a3b136e9\") " pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.625010 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7ht\" (UniqueName: \"kubernetes.io/projected/70317625-f015-449e-bee0-152cd305ffea-kube-api-access-fw7ht\") pod \"kube-state-metrics-0\" (UID: \"70317625-f015-449e-bee0-152cd305ffea\") " pod="openstack/kube-state-metrics-0"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.655955 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.667144 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="207eaf10-0008-4eb2-879e-55063bb9f189" podUID="78cdaa37-27be-4446-b098-11f5a3b136e9"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.726213 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.815159 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.841472 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.860414 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="207eaf10-0008-4eb2-879e-55063bb9f189" podUID="78cdaa37-27be-4446-b098-11f5a3b136e9"
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.888988 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret\") pod \"207eaf10-0008-4eb2-879e-55063bb9f189\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") "
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.889152 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config\") pod \"207eaf10-0008-4eb2-879e-55063bb9f189\" (UID: \"207eaf10-0008-4eb2-879e-55063bb9f189\") "
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.889555 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6szl\" (UniqueName: \"kubernetes.io/projected/207eaf10-0008-4eb2-879e-55063bb9f189-kube-api-access-q6szl\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.890254 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "207eaf10-0008-4eb2-879e-55063bb9f189" (UID: "207eaf10-0008-4eb2-879e-55063bb9f189"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.906410 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "207eaf10-0008-4eb2-879e-55063bb9f189" (UID: "207eaf10-0008-4eb2-879e-55063bb9f189"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.991795 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:14 crc kubenswrapper[5118]: I0223 08:55:14.991842 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/207eaf10-0008-4eb2-879e-55063bb9f189-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.323253 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.325645 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.329506 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.329710 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.329818 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.329883 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.331925 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-trp9f"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.350186 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.407925 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408011 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408066 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408101 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408198 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpd9x\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-kube-api-access-jpd9x\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408242 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.408281 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511401 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511432 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511463 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpd9x\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-kube-api-access-jpd9x\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511495 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511530 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511588 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.511940 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.538205 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.543064 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.543612 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.543687 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.554733 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.578240 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpd9x\" (UniqueName: \"kubernetes.io/projected/b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2-kube-api-access-jpd9x\") pod \"alertmanager-metric-storage-0\" (UID: \"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2\") " pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.583900 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.672633 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.696383 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.705420 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.716486 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="207eaf10-0008-4eb2-879e-55063bb9f189" podUID="78cdaa37-27be-4446-b098-11f5a3b136e9"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.742107 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207eaf10-0008-4eb2-879e-55063bb9f189" path="/var/lib/kubelet/pods/207eaf10-0008-4eb2-879e-55063bb9f189/volumes"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.742737 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"70317625-f015-449e-bee0-152cd305ffea","Type":"ContainerStarted","Data":"0beac8b2b8bb7b59fe6945bd27441567fa1a54622a380bcd99024ab0ad82dc9b"}
Feb 23 08:55:15 crc kubenswrapper[5118]: W0223 08:55:15.743554 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cdaa37_27be_4446_b098_11f5a3b136e9.slice/crio-f7b18663c0a9adb44db02efd5f85ff45856bf8d1800724bbd52ee5a8a2640671 WatchSource:0}: Error finding container f7b18663c0a9adb44db02efd5f85ff45856bf8d1800724bbd52ee5a8a2640671: Status 404 returned error can't find the container with id f7b18663c0a9adb44db02efd5f85ff45856bf8d1800724bbd52ee5a8a2640671
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.855194 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.857716 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.861733 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.861884 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.861981 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.862079 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8c5bv"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.862204 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.862316 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.862396 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.862447 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.882578 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.904443 5118 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="207eaf10-0008-4eb2-879e-55063bb9f189"
podUID="78cdaa37-27be-4446-b098-11f5a3b136e9" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928360 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928411 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928433 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928467 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928503 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928520 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928569 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clskf\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-kube-api-access-clskf\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928612 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928633 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:15 crc kubenswrapper[5118]: I0223 08:55:15.928654 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031085 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031172 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031200 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc 
kubenswrapper[5118]: I0223 08:55:16.031298 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031319 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031382 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clskf\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-kube-api-access-clskf\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031550 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031609 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 
08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.031639 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.032918 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.033650 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.034357 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.051127 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.051475 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.052449 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.056352 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.087020 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.088558 5118 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.088586 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8f85af7097fb9e21707316bef040f258a18afada4c40226ba4b47c024c9b26e3/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.119227 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clskf\" (UniqueName: \"kubernetes.io/projected/fb3e17bb-d95e-4bff-8e39-325f7f0517f8-kube-api-access-clskf\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.339890 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83fe7450-699c-48de-bcc1-fac6eaa85c82\") pod \"prometheus-metric-storage-0\" (UID: \"fb3e17bb-d95e-4bff-8e39-325f7f0517f8\") " pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.522353 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.537131 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.758542 5118 generic.go:334] "Generic (PLEG): container finished" podID="f8be430c-8919-4285-b27c-1df1a6765f35" containerID="0a741d03c4a5a884278bb363145d5230b6ced0174f1221aeef81de5d08cf0361" exitCode=137 Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.783584 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"78cdaa37-27be-4446-b098-11f5a3b136e9","Type":"ContainerStarted","Data":"21b998fe07c85d5f25361b7463a800d90df47493ae2782996471cbf24f006853"} Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.783631 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"78cdaa37-27be-4446-b098-11f5a3b136e9","Type":"ContainerStarted","Data":"f7b18663c0a9adb44db02efd5f85ff45856bf8d1800724bbd52ee5a8a2640671"} Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.785267 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2","Type":"ContainerStarted","Data":"d3a5856dcd3be7da645c5d2f1e2ae6da286f046e6556400c813ef34ec9dcb65f"} Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.795434 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"] Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.806414 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.870693 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.870673659 podStartE2EDuration="2.870673659s" podCreationTimestamp="2026-02-23 08:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:55:16.845492222 +0000 UTC m=+7779.849276795" watchObservedRunningTime="2026-02-23 08:55:16.870673659 +0000 UTC m=+7779.874458232" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.871288 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"] Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.913147 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.913203 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.913381 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tgx\" (UniqueName: \"kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx\") pod \"certified-operators-w9d6s\" (UID: 
\"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:16 crc kubenswrapper[5118]: I0223 08:55:16.958558 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.015812 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24tgx\" (UniqueName: \"kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.016009 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.016055 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.016804 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.017036 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.052961 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tgx\" (UniqueName: \"kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx\") pod \"certified-operators-w9d6s\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") " pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.117370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvk4d\" (UniqueName: \"kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d\") pod \"f8be430c-8919-4285-b27c-1df1a6765f35\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.117814 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config\") pod \"f8be430c-8919-4285-b27c-1df1a6765f35\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.118008 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret\") pod \"f8be430c-8919-4285-b27c-1df1a6765f35\" (UID: \"f8be430c-8919-4285-b27c-1df1a6765f35\") " Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.131215 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d" (OuterVolumeSpecName: 
"kube-api-access-jvk4d") pod "f8be430c-8919-4285-b27c-1df1a6765f35" (UID: "f8be430c-8919-4285-b27c-1df1a6765f35"). InnerVolumeSpecName "kube-api-access-jvk4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.169942 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f8be430c-8919-4285-b27c-1df1a6765f35" (UID: "f8be430c-8919-4285-b27c-1df1a6765f35"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.181157 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f8be430c-8919-4285-b27c-1df1a6765f35" (UID: "f8be430c-8919-4285-b27c-1df1a6765f35"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.221320 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvk4d\" (UniqueName: \"kubernetes.io/projected/f8be430c-8919-4285-b27c-1df1a6765f35-kube-api-access-jvk4d\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.221372 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.221386 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8be430c-8919-4285-b27c-1df1a6765f35-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.258168 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w9d6s" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.353544 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.717058 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8be430c-8919-4285-b27c-1df1a6765f35" path="/var/lib/kubelet/pods/f8be430c-8919-4285-b27c-1df1a6765f35/volumes" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.802628 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerStarted","Data":"e128156cc7b86f9c4462543020366cece063bde227d4a2eab93841733d5f4e7e"} Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.806395 5118 scope.go:117] "RemoveContainer" containerID="0a741d03c4a5a884278bb363145d5230b6ced0174f1221aeef81de5d08cf0361" Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.806590 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient"
Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.811124 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"70317625-f015-449e-bee0-152cd305ffea","Type":"ContainerStarted","Data":"4530cf83f595229328ed842b69cf5fa6c6287245257888e9fd13bae164d5f5ad"}
Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.811159 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.871231 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.233931507 podStartE2EDuration="3.871206115s" podCreationTimestamp="2026-02-23 08:55:14 +0000 UTC" firstStartedPulling="2026-02-23 08:55:15.59855715 +0000 UTC m=+7778.602341723" lastFinishedPulling="2026-02-23 08:55:16.235831758 +0000 UTC m=+7779.239616331" observedRunningTime="2026-02-23 08:55:17.856414079 +0000 UTC m=+7780.860198652" watchObservedRunningTime="2026-02-23 08:55:17.871206115 +0000 UTC m=+7780.874990688"
Feb 23 08:55:17 crc kubenswrapper[5118]: I0223 08:55:17.903010 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"]
Feb 23 08:55:18 crc kubenswrapper[5118]: I0223 08:55:18.825054 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerID="610513cfbc89339c22038d47653fe69e5cdce7ebd1777a3202860e435aba11dc" exitCode=0
Feb 23 08:55:18 crc kubenswrapper[5118]: I0223 08:55:18.825738 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerDied","Data":"610513cfbc89339c22038d47653fe69e5cdce7ebd1777a3202860e435aba11dc"}
Feb 23 08:55:18 crc kubenswrapper[5118]: I0223 08:55:18.828405 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerStarted","Data":"1599c64f2f585a61c5640a0e35f9291b6bd49fb46ad617c18b9c68df476ebc33"}
Feb 23 08:55:19 crc kubenswrapper[5118]: I0223 08:55:19.844837 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerStarted","Data":"f7598be78885f29293f697e35e2c00f3350d6948acee49ff03a13da850c7e277"}
Feb 23 08:55:22 crc kubenswrapper[5118]: I0223 08:55:22.880407 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerID="f7598be78885f29293f697e35e2c00f3350d6948acee49ff03a13da850c7e277" exitCode=0
Feb 23 08:55:22 crc kubenswrapper[5118]: I0223 08:55:22.880455 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerDied","Data":"f7598be78885f29293f697e35e2c00f3350d6948acee49ff03a13da850c7e277"}
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.698008 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:55:24 crc kubenswrapper[5118]: E0223 08:55:24.698914 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.821301 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.909447 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerStarted","Data":"33e85697d99769eb2f63b19c9fa666059c8fd3601a847d93af7fb309aee32e47"}
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.913959 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerStarted","Data":"9281329a3517dd0c98e904067bfb0d17683ed29e80b1883777dbef9dc0bb2f65"}
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.916812 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2","Type":"ContainerStarted","Data":"f488bb398552baaa3f4e31473df7b94e807d1654b5f2e6830a9b02bb023cc042"}
Feb 23 08:55:24 crc kubenswrapper[5118]: I0223 08:55:24.973355 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w9d6s" podStartSLOduration=4.704312677 podStartE2EDuration="8.973336043s" podCreationTimestamp="2026-02-23 08:55:16 +0000 UTC" firstStartedPulling="2026-02-23 08:55:19.053300214 +0000 UTC m=+7782.057084787" lastFinishedPulling="2026-02-23 08:55:23.32232358 +0000 UTC m=+7786.326108153" observedRunningTime="2026-02-23 08:55:24.967229686 +0000 UTC m=+7787.971014279" watchObservedRunningTime="2026-02-23 08:55:24.973336043 +0000 UTC m=+7787.977120616"
Feb 23 08:55:27 crc kubenswrapper[5118]: I0223 08:55:27.259413 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:27 crc kubenswrapper[5118]: I0223 08:55:27.259800 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:27 crc kubenswrapper[5118]: I0223 08:55:27.354726 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:32 crc kubenswrapper[5118]: I0223 08:55:32.002457 5118 generic.go:334] "Generic (PLEG): container finished" podID="b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2" containerID="f488bb398552baaa3f4e31473df7b94e807d1654b5f2e6830a9b02bb023cc042" exitCode=0
Feb 23 08:55:32 crc kubenswrapper[5118]: I0223 08:55:32.002584 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2","Type":"ContainerDied","Data":"f488bb398552baaa3f4e31473df7b94e807d1654b5f2e6830a9b02bb023cc042"}
Feb 23 08:55:34 crc kubenswrapper[5118]: I0223 08:55:34.030092 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb3e17bb-d95e-4bff-8e39-325f7f0517f8" containerID="33e85697d99769eb2f63b19c9fa666059c8fd3601a847d93af7fb309aee32e47" exitCode=0
Feb 23 08:55:34 crc kubenswrapper[5118]: I0223 08:55:34.030293 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerDied","Data":"33e85697d99769eb2f63b19c9fa666059c8fd3601a847d93af7fb309aee32e47"}
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.054266 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2","Type":"ContainerStarted","Data":"4f94f176df90bbdc7aeba32a90462a21583394f207dc93a7484882e5752bc4a7"}
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.057277 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lv8ql"]
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.072775 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9edb-account-create-update-njmtp"]
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.085327 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lv8ql"]
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.095813 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9edb-account-create-update-njmtp"]
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.725513 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f07931-7cb9-4a0e-a08d-1bcb570dc50e" path="/var/lib/kubelet/pods/70f07931-7cb9-4a0e-a08d-1bcb570dc50e/volumes"
Feb 23 08:55:35 crc kubenswrapper[5118]: I0223 08:55:35.727265 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2" path="/var/lib/kubelet/pods/c1da00b9-26a7-4bed-8ac6-f79d47d1f2e2/volumes"
Feb 23 08:55:36 crc kubenswrapper[5118]: I0223 08:55:36.697737 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf"
Feb 23 08:55:37 crc kubenswrapper[5118]: I0223 08:55:37.084599 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae"}
Feb 23 08:55:37 crc kubenswrapper[5118]: I0223 08:55:37.308237 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:37 crc kubenswrapper[5118]: I0223 08:55:37.355394 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"]
Feb 23 08:55:38 crc kubenswrapper[5118]: I0223 08:55:38.099339 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2","Type":"ContainerStarted","Data":"b2c8348422e404f8e47be24fa847ab75ee2878de64199d66ceeacde6132c3e63"}
Feb 23 08:55:38 crc kubenswrapper[5118]: I0223 08:55:38.099451 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w9d6s" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="registry-server" containerID="cri-o://9281329a3517dd0c98e904067bfb0d17683ed29e80b1883777dbef9dc0bb2f65" gracePeriod=2
Feb 23 08:55:38 crc kubenswrapper[5118]: I0223 08:55:38.139414 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.649407196 podStartE2EDuration="23.139394945s" podCreationTimestamp="2026-02-23 08:55:15 +0000 UTC" firstStartedPulling="2026-02-23 08:55:16.568374678 +0000 UTC m=+7779.572159251" lastFinishedPulling="2026-02-23 08:55:34.058362427 +0000 UTC m=+7797.062147000" observedRunningTime="2026-02-23 08:55:38.128421721 +0000 UTC m=+7801.132206294" watchObservedRunningTime="2026-02-23 08:55:38.139394945 +0000 UTC m=+7801.143179518"
Feb 23 08:55:39 crc kubenswrapper[5118]: I0223 08:55:39.112835 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerID="9281329a3517dd0c98e904067bfb0d17683ed29e80b1883777dbef9dc0bb2f65" exitCode=0
Feb 23 08:55:39 crc kubenswrapper[5118]: I0223 08:55:39.112959 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerDied","Data":"9281329a3517dd0c98e904067bfb0d17683ed29e80b1883777dbef9dc0bb2f65"}
Feb 23 08:55:39 crc kubenswrapper[5118]: I0223 08:55:39.113364 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:39 crc kubenswrapper[5118]: I0223 08:55:39.117180 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.822366 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.879714 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24tgx\" (UniqueName: \"kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx\") pod \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") "
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.879836 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content\") pod \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") "
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.879919 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities\") pod \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\" (UID: \"1b6e0629-8b0a-421e-a7a7-44fa552aafa3\") "
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.881689 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities" (OuterVolumeSpecName: "utilities") pod "1b6e0629-8b0a-421e-a7a7-44fa552aafa3" (UID: "1b6e0629-8b0a-421e-a7a7-44fa552aafa3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.887401 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx" (OuterVolumeSpecName: "kube-api-access-24tgx") pod "1b6e0629-8b0a-421e-a7a7-44fa552aafa3" (UID: "1b6e0629-8b0a-421e-a7a7-44fa552aafa3"). InnerVolumeSpecName "kube-api-access-24tgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.936829 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b6e0629-8b0a-421e-a7a7-44fa552aafa3" (UID: "1b6e0629-8b0a-421e-a7a7-44fa552aafa3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.982743 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.983206 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:40 crc kubenswrapper[5118]: I0223 08:55:40.983220 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24tgx\" (UniqueName: \"kubernetes.io/projected/1b6e0629-8b0a-421e-a7a7-44fa552aafa3-kube-api-access-24tgx\") on node \"crc\" DevicePath \"\""
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.138842 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerStarted","Data":"6cf50562636ae0d5e69d36e885c6eff089141f251e75a298d7909b169c4e5afc"}
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.143441 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w9d6s" event={"ID":"1b6e0629-8b0a-421e-a7a7-44fa552aafa3","Type":"ContainerDied","Data":"1599c64f2f585a61c5640a0e35f9291b6bd49fb46ad617c18b9c68df476ebc33"}
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.143487 5118 scope.go:117] "RemoveContainer" containerID="9281329a3517dd0c98e904067bfb0d17683ed29e80b1883777dbef9dc0bb2f65"
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.143687 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w9d6s"
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.169774 5118 scope.go:117] "RemoveContainer" containerID="f7598be78885f29293f697e35e2c00f3350d6948acee49ff03a13da850c7e277"
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.190801 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"]
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.201556 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w9d6s"]
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.210487 5118 scope.go:117] "RemoveContainer" containerID="610513cfbc89339c22038d47653fe69e5cdce7ebd1777a3202860e435aba11dc"
Feb 23 08:55:41 crc kubenswrapper[5118]: I0223 08:55:41.708687 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" path="/var/lib/kubelet/pods/1b6e0629-8b0a-421e-a7a7-44fa552aafa3/volumes"
Feb 23 08:55:45 crc kubenswrapper[5118]: I0223 08:55:45.039406 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-br4mg"]
Feb 23 08:55:45 crc kubenswrapper[5118]: I0223 08:55:45.048578 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-br4mg"]
Feb 23 08:55:45 crc kubenswrapper[5118]: I0223 08:55:45.197515 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerStarted","Data":"c1ebd386480d80976509333cc1507ee9085de336c0c56005423e2e98e56725c4"}
Feb 23 08:55:45 crc kubenswrapper[5118]: I0223 08:55:45.723259 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d79556d-66cd-44c2-84b0-d20e1c8768b0" path="/var/lib/kubelet/pods/8d79556d-66cd-44c2-84b0-d20e1c8768b0/volumes"
Feb 23 08:55:48 crc kubenswrapper[5118]: I0223 08:55:48.237213 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fb3e17bb-d95e-4bff-8e39-325f7f0517f8","Type":"ContainerStarted","Data":"b690d2f79316d4a023d523a03deec0abc408982984858e8dd322ac0d635b6893"}
Feb 23 08:55:48 crc kubenswrapper[5118]: I0223 08:55:48.268577 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.371287249 podStartE2EDuration="34.268558016s" podCreationTimestamp="2026-02-23 08:55:14 +0000 UTC" firstStartedPulling="2026-02-23 08:55:17.367413211 +0000 UTC m=+7780.371197784" lastFinishedPulling="2026-02-23 08:55:47.264683968 +0000 UTC m=+7810.268468551" observedRunningTime="2026-02-23 08:55:48.265297328 +0000 UTC m=+7811.269081911" watchObservedRunningTime="2026-02-23 08:55:48.268558016 +0000 UTC m=+7811.272342589"
Feb 23 08:55:51 crc kubenswrapper[5118]: I0223 08:55:51.537970 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.660469 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 23 08:55:53 crc kubenswrapper[5118]: E0223 08:55:53.661566 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="extract-content"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.661583 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="extract-content"
Feb 23 08:55:53 crc kubenswrapper[5118]: E0223 08:55:53.661617 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="registry-server"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.661627 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="registry-server"
Feb 23 08:55:53 crc kubenswrapper[5118]: E0223 08:55:53.661657 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="extract-utilities"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.661666 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="extract-utilities"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.661930 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b6e0629-8b0a-421e-a7a7-44fa552aafa3" containerName="registry-server"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.664598 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.667289 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.670767 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.673217 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.778879 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.778976 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.779228 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6wx9\" (UniqueName: \"kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.779326 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.779540 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.779881 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.779920 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.882106 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.882256 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.882289 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.883591 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.883844 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.884087 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6wx9\" (UniqueName: \"kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.884183 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.884328 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.884590 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.890894 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.891801 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.892095 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.895012 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.906593 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6wx9\" (UniqueName: \"kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9\") pod \"ceilometer-0\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " pod="openstack/ceilometer-0"
Feb 23 08:55:53 crc kubenswrapper[5118]: I0223 08:55:53.989670 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 23 08:55:54 crc kubenswrapper[5118]: I0223 08:55:54.490716 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 23 08:55:54 crc kubenswrapper[5118]: I0223 08:55:54.901911 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"]
Feb 23 08:55:54 crc kubenswrapper[5118]: I0223 08:55:54.910421 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:54 crc kubenswrapper[5118]: I0223 08:55:54.928886 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"]
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.032548 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcg9p\" (UniqueName: \"kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.032641 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.032715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.134194 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.134343 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcg9p\" (UniqueName: \"kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.134395 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.134829 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.135684 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.172364 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcg9p\" (UniqueName: \"kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p\") pod \"redhat-operators-9vvbs\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.242701 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vvbs"
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.336470 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerStarted","Data":"25166adce8032b73d359e4b7f4992ca50c3acc4ae6fe919c089d575642dc8b4e"}
Feb 23 08:55:55 crc kubenswrapper[5118]: I0223 08:55:55.821929 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"]
Feb 23 08:55:56 crc kubenswrapper[5118]: I0223 08:55:56.349668 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerID="4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27" exitCode=0
Feb 23 08:55:56 crc kubenswrapper[5118]: I0223 08:55:56.350596 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerDied","Data":"4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27"}
Feb 23 08:55:56 crc kubenswrapper[5118]: I0223 08:55:56.350711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerStarted","Data":"280424da1910b6bbd3e3f7b17c3092f66b12bc70725f8b436af0273eb1511ea9"}
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.453225 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g6fgs"]
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.455656 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.478379 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6fgs"]
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.490801 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr7kk\" (UniqueName: \"kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.491021 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.491202 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.593850 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7kk\" (UniqueName: \"kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.593924 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.594336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.594532 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.594972 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.619314 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7kk\" (UniqueName: \"kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk\") pod \"community-operators-g6fgs\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:55:57 crc kubenswrapper[5118]: I0223 08:55:57.794795 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6fgs"
Feb 23 08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.050188 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g6fgs"]
Feb 23 08:56:00 crc kubenswrapper[5118]: W0223 08:56:00.100843 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86e4ee3_789f_4889_96b4_63dd7c385a36.slice/crio-0bdc28cd1619953f2b77305e559a36246a0590eecc9c057cde10703a7c48a480 WatchSource:0}: Error finding container 0bdc28cd1619953f2b77305e559a36246a0590eecc9c057cde10703a7c48a480: Status 404 returned error can't find the container with id 0bdc28cd1619953f2b77305e559a36246a0590eecc9c057cde10703a7c48a480
Feb 23 08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.396761 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerStarted","Data":"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60"}
Feb 23 08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.398381 5118 generic.go:334] "Generic (PLEG): container finished" podID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerID="f7b2da5417eb0cce5ac61a976ff690553a21646d17d138d1622d4e432406bec6" exitCode=0
Feb 23 08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.398425 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerDied","Data":"f7b2da5417eb0cce5ac61a976ff690553a21646d17d138d1622d4e432406bec6"}
Feb 23 08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.398442 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerStarted","Data":"0bdc28cd1619953f2b77305e559a36246a0590eecc9c057cde10703a7c48a480"}
Feb 23
08:56:00 crc kubenswrapper[5118]: I0223 08:56:00.404816 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerStarted","Data":"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9"} Feb 23 08:56:01 crc kubenswrapper[5118]: I0223 08:56:01.537927 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 23 08:56:01 crc kubenswrapper[5118]: I0223 08:56:01.545357 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 23 08:56:02 crc kubenswrapper[5118]: I0223 08:56:02.436227 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 23 08:56:03 crc kubenswrapper[5118]: I0223 08:56:03.453225 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerDied","Data":"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9"} Feb 23 08:56:03 crc kubenswrapper[5118]: I0223 08:56:03.453410 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerID="313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9" exitCode=0 Feb 23 08:56:03 crc kubenswrapper[5118]: I0223 08:56:03.462023 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerStarted","Data":"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75"} Feb 23 08:56:03 crc kubenswrapper[5118]: I0223 08:56:03.465358 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" 
event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerStarted","Data":"94d6af968a6e2b4c3496865323efe2ff0cf44f8a3995b77c390b7b49c235a3ed"} Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.300607 5118 scope.go:117] "RemoveContainer" containerID="75beeca2c1a18c4bee7e944f7edb32ec5389bcec31df6b1cab9f3caff2e0274c" Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.337042 5118 scope.go:117] "RemoveContainer" containerID="f2c8e7a7bb3db481c12653260835aa44391f71ce1f64ab8bda764963bb758289" Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.390012 5118 scope.go:117] "RemoveContainer" containerID="10a7ae64db688633500efa2e11ec05d691ccca7ba6abfcacaef23f6f841acdcf" Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.482256 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerStarted","Data":"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e"} Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.489348 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerStarted","Data":"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f"} Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.492158 5118 generic.go:334] "Generic (PLEG): container finished" podID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerID="94d6af968a6e2b4c3496865323efe2ff0cf44f8a3995b77c390b7b49c235a3ed" exitCode=0 Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.492194 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerDied","Data":"94d6af968a6e2b4c3496865323efe2ff0cf44f8a3995b77c390b7b49c235a3ed"} Feb 23 08:56:04 crc kubenswrapper[5118]: I0223 08:56:04.541002 5118 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-9vvbs" podStartSLOduration=5.519243447 podStartE2EDuration="10.540980352s" podCreationTimestamp="2026-02-23 08:55:54 +0000 UTC" firstStartedPulling="2026-02-23 08:55:58.82843281 +0000 UTC m=+7821.832217393" lastFinishedPulling="2026-02-23 08:56:03.850169735 +0000 UTC m=+7826.853954298" observedRunningTime="2026-02-23 08:56:04.508642763 +0000 UTC m=+7827.512427356" watchObservedRunningTime="2026-02-23 08:56:04.540980352 +0000 UTC m=+7827.544764925" Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.243399 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.243724 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.505674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerStarted","Data":"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01"} Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.505861 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.509566 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerStarted","Data":"226e1b6e359605446e910af19ce6cb284015b4de75a8a95725e92b7640fa98e4"} Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.543070 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.79127931 podStartE2EDuration="12.543034736s" podCreationTimestamp="2026-02-23 08:55:53 +0000 UTC" firstStartedPulling="2026-02-23 
08:55:54.498700233 +0000 UTC m=+7817.502484806" lastFinishedPulling="2026-02-23 08:56:05.250455659 +0000 UTC m=+7828.254240232" observedRunningTime="2026-02-23 08:56:05.527259276 +0000 UTC m=+7828.531043849" watchObservedRunningTime="2026-02-23 08:56:05.543034736 +0000 UTC m=+7828.546819309" Feb 23 08:56:05 crc kubenswrapper[5118]: I0223 08:56:05.551808 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g6fgs" podStartSLOduration=4.083057211 podStartE2EDuration="8.551787396s" podCreationTimestamp="2026-02-23 08:55:57 +0000 UTC" firstStartedPulling="2026-02-23 08:56:00.400040772 +0000 UTC m=+7823.403825345" lastFinishedPulling="2026-02-23 08:56:04.868770957 +0000 UTC m=+7827.872555530" observedRunningTime="2026-02-23 08:56:05.54571024 +0000 UTC m=+7828.549494803" watchObservedRunningTime="2026-02-23 08:56:05.551787396 +0000 UTC m=+7828.555571969" Feb 23 08:56:06 crc kubenswrapper[5118]: I0223 08:56:06.298541 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9vvbs" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="registry-server" probeResult="failure" output=< Feb 23 08:56:06 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 08:56:06 crc kubenswrapper[5118]: > Feb 23 08:56:07 crc kubenswrapper[5118]: I0223 08:56:07.796210 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:07 crc kubenswrapper[5118]: I0223 08:56:07.796624 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:07 crc kubenswrapper[5118]: I0223 08:56:07.855134 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:10 crc kubenswrapper[5118]: I0223 08:56:10.870981 5118 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-rd62q"] Feb 23 08:56:10 crc kubenswrapper[5118]: I0223 08:56:10.873185 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:10 crc kubenswrapper[5118]: I0223 08:56:10.884805 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rd62q"] Feb 23 08:56:10 crc kubenswrapper[5118]: I0223 08:56:10.986289 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk67t\" (UniqueName: \"kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t\") pod \"aodh-db-create-rd62q\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:10 crc kubenswrapper[5118]: I0223 08:56:10.986413 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts\") pod \"aodh-db-create-rd62q\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.023974 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-1fc5-account-create-update-jpknx"] Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.025864 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.033876 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1fc5-account-create-update-jpknx"] Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.038683 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.089976 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67jfr\" (UniqueName: \"kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.090038 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts\") pod \"aodh-db-create-rd62q\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.090154 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.090174 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk67t\" (UniqueName: \"kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t\") pod \"aodh-db-create-rd62q\" (UID: 
\"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.091204 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts\") pod \"aodh-db-create-rd62q\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.115293 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk67t\" (UniqueName: \"kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t\") pod \"aodh-db-create-rd62q\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.192330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.193458 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.194045 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67jfr\" (UniqueName: \"kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: 
\"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.218844 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67jfr\" (UniqueName: \"kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr\") pod \"aodh-1fc5-account-create-update-jpknx\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.247388 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.359713 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.805611 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rd62q"] Feb 23 08:56:11 crc kubenswrapper[5118]: W0223 08:56:11.810193 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a17dcef_17b9_4b97_8d8e_5793a42993fa.slice/crio-a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa WatchSource:0}: Error finding container a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa: Status 404 returned error can't find the container with id a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa Feb 23 08:56:11 crc kubenswrapper[5118]: I0223 08:56:11.937958 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-1fc5-account-create-update-jpknx"] Feb 23 08:56:11 crc kubenswrapper[5118]: W0223 08:56:11.941045 5118 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724cca70_923b_4ffd_9acd_8091e7e33632.slice/crio-820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67 WatchSource:0}: Error finding container 820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67: Status 404 returned error can't find the container with id 820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67 Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.596976 5118 generic.go:334] "Generic (PLEG): container finished" podID="2a17dcef-17b9-4b97-8d8e-5793a42993fa" containerID="03d8b73093ced66b25c2400e2eb2d65570e3f23cec2116c02c6f866b0efb2804" exitCode=0 Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.597173 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rd62q" event={"ID":"2a17dcef-17b9-4b97-8d8e-5793a42993fa","Type":"ContainerDied","Data":"03d8b73093ced66b25c2400e2eb2d65570e3f23cec2116c02c6f866b0efb2804"} Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.597420 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rd62q" event={"ID":"2a17dcef-17b9-4b97-8d8e-5793a42993fa","Type":"ContainerStarted","Data":"a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa"} Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.599382 5118 generic.go:334] "Generic (PLEG): container finished" podID="724cca70-923b-4ffd-9acd-8091e7e33632" containerID="11b9f61934120e6a994398b7c68652af10d3cac440288682549d3130cefee46c" exitCode=0 Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.599417 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1fc5-account-create-update-jpknx" event={"ID":"724cca70-923b-4ffd-9acd-8091e7e33632","Type":"ContainerDied","Data":"11b9f61934120e6a994398b7c68652af10d3cac440288682549d3130cefee46c"} Feb 23 08:56:12 crc kubenswrapper[5118]: I0223 08:56:12.599441 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-1fc5-account-create-update-jpknx" event={"ID":"724cca70-923b-4ffd-9acd-8091e7e33632","Type":"ContainerStarted","Data":"820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67"} Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.128150 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.136259 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.263560 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67jfr\" (UniqueName: \"kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr\") pod \"724cca70-923b-4ffd-9acd-8091e7e33632\" (UID: \"724cca70-923b-4ffd-9acd-8091e7e33632\") " Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.263704 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts\") pod \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.263783 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk67t\" (UniqueName: \"kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t\") pod \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\" (UID: \"2a17dcef-17b9-4b97-8d8e-5793a42993fa\") " Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.263810 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts\") pod \"724cca70-923b-4ffd-9acd-8091e7e33632\" (UID: 
\"724cca70-923b-4ffd-9acd-8091e7e33632\") " Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.264582 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a17dcef-17b9-4b97-8d8e-5793a42993fa" (UID: "2a17dcef-17b9-4b97-8d8e-5793a42993fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.264615 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "724cca70-923b-4ffd-9acd-8091e7e33632" (UID: "724cca70-923b-4ffd-9acd-8091e7e33632"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.264912 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17dcef-17b9-4b97-8d8e-5793a42993fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.264942 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/724cca70-923b-4ffd-9acd-8091e7e33632-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.275571 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr" (OuterVolumeSpecName: "kube-api-access-67jfr") pod "724cca70-923b-4ffd-9acd-8091e7e33632" (UID: "724cca70-923b-4ffd-9acd-8091e7e33632"). InnerVolumeSpecName "kube-api-access-67jfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.275641 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t" (OuterVolumeSpecName: "kube-api-access-sk67t") pod "2a17dcef-17b9-4b97-8d8e-5793a42993fa" (UID: "2a17dcef-17b9-4b97-8d8e-5793a42993fa"). InnerVolumeSpecName "kube-api-access-sk67t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.366735 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67jfr\" (UniqueName: \"kubernetes.io/projected/724cca70-923b-4ffd-9acd-8091e7e33632-kube-api-access-67jfr\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.366776 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk67t\" (UniqueName: \"kubernetes.io/projected/2a17dcef-17b9-4b97-8d8e-5793a42993fa-kube-api-access-sk67t\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.642873 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-1fc5-account-create-update-jpknx" event={"ID":"724cca70-923b-4ffd-9acd-8091e7e33632","Type":"ContainerDied","Data":"820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67"} Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.642949 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="820e01e135daed9e8cb4ede6cdfe8b9204ab6aefec2dbd996f09f53f77a2be67" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.643054 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-1fc5-account-create-update-jpknx" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.680381 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rd62q" event={"ID":"2a17dcef-17b9-4b97-8d8e-5793a42993fa","Type":"ContainerDied","Data":"a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa"} Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.680434 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a32edb2999fb4d504cdbc5ad0e56359b1876d2d24b739ff3db488169ca15bdfa" Feb 23 08:56:14 crc kubenswrapper[5118]: I0223 08:56:14.680519 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rd62q" Feb 23 08:56:15 crc kubenswrapper[5118]: I0223 08:56:15.305448 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:15 crc kubenswrapper[5118]: I0223 08:56:15.359860 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:15 crc kubenswrapper[5118]: I0223 08:56:15.541470 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"] Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.322946 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-pmzh9"] Feb 23 08:56:16 crc kubenswrapper[5118]: E0223 08:56:16.323425 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a17dcef-17b9-4b97-8d8e-5793a42993fa" containerName="mariadb-database-create" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.323438 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a17dcef-17b9-4b97-8d8e-5793a42993fa" containerName="mariadb-database-create" Feb 23 08:56:16 crc kubenswrapper[5118]: E0223 08:56:16.323451 5118 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="724cca70-923b-4ffd-9acd-8091e7e33632" containerName="mariadb-account-create-update" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.323456 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="724cca70-923b-4ffd-9acd-8091e7e33632" containerName="mariadb-account-create-update" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.323645 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="724cca70-923b-4ffd-9acd-8091e7e33632" containerName="mariadb-account-create-update" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.323664 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a17dcef-17b9-4b97-8d8e-5793a42993fa" containerName="mariadb-database-create" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.324387 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.326212 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-wkqm4" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.326795 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.327139 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.327333 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.345957 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pmzh9"] Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.420388 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data\") pod 
\"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.420460 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qdj\" (UniqueName: \"kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.420514 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.420579 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.522693 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.522832 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " 
pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.522901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qdj\" (UniqueName: \"kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.522978 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.527886 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.528472 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.530045 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.543549 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qdj\" 
(UniqueName: \"kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj\") pod \"aodh-db-sync-pmzh9\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.690857 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:16 crc kubenswrapper[5118]: I0223 08:56:16.698655 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9vvbs" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="registry-server" containerID="cri-o://af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e" gracePeriod=2 Feb 23 08:56:17 crc kubenswrapper[5118]: E0223 08:56:17.084501 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4b3873_7378_41ff_969c_0fb14d67b3bb.slice/crio-af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4b3873_7378_41ff_969c_0fb14d67b3bb.slice/crio-conmon-af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e.scope\": RecentStats: unable to find data in memory cache]" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.222753 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.245076 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pmzh9"] Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.342706 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content\") pod \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.342965 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities\") pod \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.343042 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcg9p\" (UniqueName: \"kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p\") pod \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\" (UID: \"2e4b3873-7378-41ff-969c-0fb14d67b3bb\") " Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.344511 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities" (OuterVolumeSpecName: "utilities") pod "2e4b3873-7378-41ff-969c-0fb14d67b3bb" (UID: "2e4b3873-7378-41ff-969c-0fb14d67b3bb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.349006 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p" (OuterVolumeSpecName: "kube-api-access-zcg9p") pod "2e4b3873-7378-41ff-969c-0fb14d67b3bb" (UID: "2e4b3873-7378-41ff-969c-0fb14d67b3bb"). InnerVolumeSpecName "kube-api-access-zcg9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.445827 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.446088 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcg9p\" (UniqueName: \"kubernetes.io/projected/2e4b3873-7378-41ff-969c-0fb14d67b3bb-kube-api-access-zcg9p\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.465155 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e4b3873-7378-41ff-969c-0fb14d67b3bb" (UID: "2e4b3873-7378-41ff-969c-0fb14d67b3bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.550943 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e4b3873-7378-41ff-969c-0fb14d67b3bb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.719421 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pmzh9" event={"ID":"651f79aa-be9a-40aa-afae-ebb8d8497a7d","Type":"ContainerStarted","Data":"abb860c365c09dd5f4e07a1088c5d481de7ab142666a0d75b66a93687997d638"} Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.732423 5118 generic.go:334] "Generic (PLEG): container finished" podID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerID="af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e" exitCode=0 Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.732468 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerDied","Data":"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e"} Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.732498 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vvbs" event={"ID":"2e4b3873-7378-41ff-969c-0fb14d67b3bb","Type":"ContainerDied","Data":"280424da1910b6bbd3e3f7b17c3092f66b12bc70725f8b436af0273eb1511ea9"} Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.732516 5118 scope.go:117] "RemoveContainer" containerID="af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.732656 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vvbs" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.762708 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"] Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.763534 5118 scope.go:117] "RemoveContainer" containerID="313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.773288 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9vvbs"] Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.791741 5118 scope.go:117] "RemoveContainer" containerID="4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.830401 5118 scope.go:117] "RemoveContainer" containerID="af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e" Feb 23 08:56:17 crc kubenswrapper[5118]: E0223 08:56:17.831162 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e\": container with ID starting with af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e not found: ID does not exist" containerID="af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.831269 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e"} err="failed to get container status \"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e\": rpc error: code = NotFound desc = could not find container \"af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e\": container with ID starting with af343bc5129d7a9755af9dd2dd064e37658a17206a8aa8f3c6a195ff3704f32e not found: ID does 
not exist" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.831297 5118 scope.go:117] "RemoveContainer" containerID="313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9" Feb 23 08:56:17 crc kubenswrapper[5118]: E0223 08:56:17.831681 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9\": container with ID starting with 313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9 not found: ID does not exist" containerID="313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.831707 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9"} err="failed to get container status \"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9\": rpc error: code = NotFound desc = could not find container \"313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9\": container with ID starting with 313a5075a2251481fa23af97689c1d066ebe6b7e76c90f54a97a9f4484dfafe9 not found: ID does not exist" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.831755 5118 scope.go:117] "RemoveContainer" containerID="4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27" Feb 23 08:56:17 crc kubenswrapper[5118]: E0223 08:56:17.832266 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27\": container with ID starting with 4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27 not found: ID does not exist" containerID="4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.832311 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27"} err="failed to get container status \"4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27\": rpc error: code = NotFound desc = could not find container \"4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27\": container with ID starting with 4924d22dccf44ae80e118a744a9da7811d69a2d6b0abc754518184f40a6a8b27 not found: ID does not exist" Feb 23 08:56:17 crc kubenswrapper[5118]: I0223 08:56:17.903399 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:19 crc kubenswrapper[5118]: I0223 08:56:19.716423 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" path="/var/lib/kubelet/pods/2e4b3873-7378-41ff-969c-0fb14d67b3bb/volumes" Feb 23 08:56:20 crc kubenswrapper[5118]: I0223 08:56:20.138625 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6fgs"] Feb 23 08:56:20 crc kubenswrapper[5118]: I0223 08:56:20.138868 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g6fgs" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="registry-server" containerID="cri-o://226e1b6e359605446e910af19ce6cb284015b4de75a8a95725e92b7640fa98e4" gracePeriod=2 Feb 23 08:56:20 crc kubenswrapper[5118]: I0223 08:56:20.955390 5118 generic.go:334] "Generic (PLEG): container finished" podID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerID="226e1b6e359605446e910af19ce6cb284015b4de75a8a95725e92b7640fa98e4" exitCode=0 Feb 23 08:56:20 crc kubenswrapper[5118]: I0223 08:56:20.955723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" 
event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerDied","Data":"226e1b6e359605446e910af19ce6cb284015b4de75a8a95725e92b7640fa98e4"} Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.468049 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.474070 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr7kk\" (UniqueName: \"kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk\") pod \"a86e4ee3-789f-4889-96b4-63dd7c385a36\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.474194 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content\") pod \"a86e4ee3-789f-4889-96b4-63dd7c385a36\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.474384 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities\") pod \"a86e4ee3-789f-4889-96b4-63dd7c385a36\" (UID: \"a86e4ee3-789f-4889-96b4-63dd7c385a36\") " Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.475077 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities" (OuterVolumeSpecName: "utilities") pod "a86e4ee3-789f-4889-96b4-63dd7c385a36" (UID: "a86e4ee3-789f-4889-96b4-63dd7c385a36"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.483429 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk" (OuterVolumeSpecName: "kube-api-access-xr7kk") pod "a86e4ee3-789f-4889-96b4-63dd7c385a36" (UID: "a86e4ee3-789f-4889-96b4-63dd7c385a36"). InnerVolumeSpecName "kube-api-access-xr7kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.533021 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a86e4ee3-789f-4889-96b4-63dd7c385a36" (UID: "a86e4ee3-789f-4889-96b4-63dd7c385a36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.577966 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr7kk\" (UniqueName: \"kubernetes.io/projected/a86e4ee3-789f-4889-96b4-63dd7c385a36-kube-api-access-xr7kk\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.578014 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:22 crc kubenswrapper[5118]: I0223 08:56:22.578026 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86e4ee3-789f-4889-96b4-63dd7c385a36-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.000564 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g6fgs" 
event={"ID":"a86e4ee3-789f-4889-96b4-63dd7c385a36","Type":"ContainerDied","Data":"0bdc28cd1619953f2b77305e559a36246a0590eecc9c057cde10703a7c48a480"} Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.000830 5118 scope.go:117] "RemoveContainer" containerID="226e1b6e359605446e910af19ce6cb284015b4de75a8a95725e92b7640fa98e4" Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.000614 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g6fgs" Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.021350 5118 scope.go:117] "RemoveContainer" containerID="94d6af968a6e2b4c3496865323efe2ff0cf44f8a3995b77c390b7b49c235a3ed" Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.038015 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g6fgs"] Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.048177 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g6fgs"] Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.063105 5118 scope.go:117] "RemoveContainer" containerID="f7b2da5417eb0cce5ac61a976ff690553a21646d17d138d1622d4e432406bec6" Feb 23 08:56:23 crc kubenswrapper[5118]: I0223 08:56:23.712340 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" path="/var/lib/kubelet/pods/a86e4ee3-789f-4889-96b4-63dd7c385a36/volumes" Feb 23 08:56:24 crc kubenswrapper[5118]: I0223 08:56:24.005275 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 08:56:24 crc kubenswrapper[5118]: I0223 08:56:24.015566 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pmzh9" event={"ID":"651f79aa-be9a-40aa-afae-ebb8d8497a7d","Type":"ContainerStarted","Data":"77f6c7533e8f9c84f7533b4d524125c76f2b22db261909594b9fb3ba4155fd34"} Feb 23 08:56:24 crc kubenswrapper[5118]: I0223 
08:56:24.084235 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-pmzh9" podStartSLOduration=2.518397725 podStartE2EDuration="8.084188012s" podCreationTimestamp="2026-02-23 08:56:16 +0000 UTC" firstStartedPulling="2026-02-23 08:56:17.250484099 +0000 UTC m=+7840.254268672" lastFinishedPulling="2026-02-23 08:56:22.816274386 +0000 UTC m=+7845.820058959" observedRunningTime="2026-02-23 08:56:24.055601663 +0000 UTC m=+7847.059386266" watchObservedRunningTime="2026-02-23 08:56:24.084188012 +0000 UTC m=+7847.087972585" Feb 23 08:56:26 crc kubenswrapper[5118]: I0223 08:56:26.038028 5118 generic.go:334] "Generic (PLEG): container finished" podID="651f79aa-be9a-40aa-afae-ebb8d8497a7d" containerID="77f6c7533e8f9c84f7533b4d524125c76f2b22db261909594b9fb3ba4155fd34" exitCode=0 Feb 23 08:56:26 crc kubenswrapper[5118]: I0223 08:56:26.038169 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pmzh9" event={"ID":"651f79aa-be9a-40aa-afae-ebb8d8497a7d","Type":"ContainerDied","Data":"77f6c7533e8f9c84f7533b4d524125c76f2b22db261909594b9fb3ba4155fd34"} Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.473663 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.578260 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts\") pod \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.578391 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle\") pod \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.578452 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4qdj\" (UniqueName: \"kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj\") pod \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.578588 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data\") pod \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\" (UID: \"651f79aa-be9a-40aa-afae-ebb8d8497a7d\") " Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.586018 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj" (OuterVolumeSpecName: "kube-api-access-m4qdj") pod "651f79aa-be9a-40aa-afae-ebb8d8497a7d" (UID: "651f79aa-be9a-40aa-afae-ebb8d8497a7d"). InnerVolumeSpecName "kube-api-access-m4qdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.586676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts" (OuterVolumeSpecName: "scripts") pod "651f79aa-be9a-40aa-afae-ebb8d8497a7d" (UID: "651f79aa-be9a-40aa-afae-ebb8d8497a7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.610421 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "651f79aa-be9a-40aa-afae-ebb8d8497a7d" (UID: "651f79aa-be9a-40aa-afae-ebb8d8497a7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.629824 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data" (OuterVolumeSpecName: "config-data") pod "651f79aa-be9a-40aa-afae-ebb8d8497a7d" (UID: "651f79aa-be9a-40aa-afae-ebb8d8497a7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.681475 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4qdj\" (UniqueName: \"kubernetes.io/projected/651f79aa-be9a-40aa-afae-ebb8d8497a7d-kube-api-access-m4qdj\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.681507 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.681517 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:27 crc kubenswrapper[5118]: I0223 08:56:27.681526 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f79aa-be9a-40aa-afae-ebb8d8497a7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:28 crc kubenswrapper[5118]: I0223 08:56:28.059547 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pmzh9" event={"ID":"651f79aa-be9a-40aa-afae-ebb8d8497a7d","Type":"ContainerDied","Data":"abb860c365c09dd5f4e07a1088c5d481de7ab142666a0d75b66a93687997d638"} Feb 23 08:56:28 crc kubenswrapper[5118]: I0223 08:56:28.059620 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb860c365c09dd5f4e07a1088c5d481de7ab142666a0d75b66a93687997d638" Feb 23 08:56:28 crc kubenswrapper[5118]: I0223 08:56:28.059725 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pmzh9" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.131621 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132432 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132446 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132481 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="extract-utilities" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132487 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="extract-utilities" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132499 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132505 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132518 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f79aa-be9a-40aa-afae-ebb8d8497a7d" containerName="aodh-db-sync" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132525 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f79aa-be9a-40aa-afae-ebb8d8497a7d" containerName="aodh-db-sync" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132543 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="extract-utilities" Feb 23 
08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132549 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="extract-utilities" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132565 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="extract-content" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132571 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="extract-content" Feb 23 08:56:31 crc kubenswrapper[5118]: E0223 08:56:31.132581 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="extract-content" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132586 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="extract-content" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132765 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86e4ee3-789f-4889-96b4-63dd7c385a36" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132785 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4b3873-7378-41ff-969c-0fb14d67b3bb" containerName="registry-server" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.132794 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="651f79aa-be9a-40aa-afae-ebb8d8497a7d" containerName="aodh-db-sync" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.138223 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.143630 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-wkqm4" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.143862 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.143974 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.151590 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.264705 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckcf\" (UniqueName: \"kubernetes.io/projected/7b334d5c-a053-474f-8395-432faf152c91-kube-api-access-4ckcf\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.264795 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-scripts\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.264855 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-config-data\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.265391 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.367337 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckcf\" (UniqueName: \"kubernetes.io/projected/7b334d5c-a053-474f-8395-432faf152c91-kube-api-access-4ckcf\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.367435 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-scripts\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.367505 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-config-data\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.367588 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.378302 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.380248 5118 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-scripts\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.386127 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b334d5c-a053-474f-8395-432faf152c91-config-data\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.393567 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckcf\" (UniqueName: \"kubernetes.io/projected/7b334d5c-a053-474f-8395-432faf152c91-kube-api-access-4ckcf\") pod \"aodh-0\" (UID: \"7b334d5c-a053-474f-8395-432faf152c91\") " pod="openstack/aodh-0" Feb 23 08:56:31 crc kubenswrapper[5118]: I0223 08:56:31.484538 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.007925 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.112384 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7b334d5c-a053-474f-8395-432faf152c91","Type":"ContainerStarted","Data":"4330630f24def1312e894d33500a4481bf2cb8ff047e6a4f8844b8b11d6fd1df"} Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.566123 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.566888 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-central-agent" containerID="cri-o://03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60" gracePeriod=30 Feb 23 
08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.566997 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="sg-core" containerID="cri-o://1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f" gracePeriod=30 Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.567011 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="proxy-httpd" containerID="cri-o://e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01" gracePeriod=30 Feb 23 08:56:33 crc kubenswrapper[5118]: I0223 08:56:33.567026 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-notification-agent" containerID="cri-o://bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75" gracePeriod=30 Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.128278 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7b334d5c-a053-474f-8395-432faf152c91","Type":"ContainerStarted","Data":"7dca6bcc83b762ee28e9b448afb271a42fec471dde26478c307c0ea382d3a28e"} Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131590 5118 generic.go:334] "Generic (PLEG): container finished" podID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerID="e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01" exitCode=0 Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131636 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerDied","Data":"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01"} Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131659 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerDied","Data":"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f"} Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131614 5118 generic.go:334] "Generic (PLEG): container finished" podID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerID="1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f" exitCode=2 Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131727 5118 generic.go:334] "Generic (PLEG): container finished" podID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerID="03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60" exitCode=0 Feb 23 08:56:34 crc kubenswrapper[5118]: I0223 08:56:34.131746 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerDied","Data":"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60"} Feb 23 08:56:35 crc kubenswrapper[5118]: I0223 08:56:35.149081 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7b334d5c-a053-474f-8395-432faf152c91","Type":"ContainerStarted","Data":"c3a4bd39df38de943df978b0d785fd402737b35290a33cf90eb709f654954822"} Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.169273 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7b334d5c-a053-474f-8395-432faf152c91","Type":"ContainerStarted","Data":"03ca06993e003d51a59109c82020dad0ef40947b0d46f61a418bdf39f5165d34"} Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.896223 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.909772 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.909926 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6wx9\" (UniqueName: \"kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.910005 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.910068 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.910178 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.910254 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.910285 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd\") pod \"e626a607-7d5e-41f2-8ba5-80b8177fc042\" (UID: \"e626a607-7d5e-41f2-8ba5-80b8177fc042\") " Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.911799 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.914007 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.914426 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts" (OuterVolumeSpecName: "scripts") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.921326 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9" (OuterVolumeSpecName: "kube-api-access-q6wx9") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "kube-api-access-q6wx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:36 crc kubenswrapper[5118]: I0223 08:56:36.987919 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.012933 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.012970 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.012980 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6wx9\" (UniqueName: \"kubernetes.io/projected/e626a607-7d5e-41f2-8ba5-80b8177fc042-kube-api-access-q6wx9\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.012990 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e626a607-7d5e-41f2-8ba5-80b8177fc042-run-httpd\") on 
node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.013000 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.034463 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data" (OuterVolumeSpecName: "config-data") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.064231 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e626a607-7d5e-41f2-8ba5-80b8177fc042" (UID: "e626a607-7d5e-41f2-8ba5-80b8177fc042"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.114072 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.114117 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e626a607-7d5e-41f2-8ba5-80b8177fc042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.189034 5118 generic.go:334] "Generic (PLEG): container finished" podID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerID="bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75" exitCode=0 Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.189236 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerDied","Data":"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75"} Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.189620 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e626a607-7d5e-41f2-8ba5-80b8177fc042","Type":"ContainerDied","Data":"25166adce8032b73d359e4b7f4992ca50c3acc4ae6fe919c089d575642dc8b4e"} Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.189432 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.189646 5118 scope.go:117] "RemoveContainer" containerID="e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.219085 5118 scope.go:117] "RemoveContainer" containerID="1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.254454 5118 scope.go:117] "RemoveContainer" containerID="bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.263159 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.274391 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.286752 5118 scope.go:117] "RemoveContainer" containerID="03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296157 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.296648 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="proxy-httpd" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296661 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="proxy-httpd" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.296672 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-notification-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296678 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" 
containerName="ceilometer-notification-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.296691 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="sg-core" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296697 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="sg-core" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.296717 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-central-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296723 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-central-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296939 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-central-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296955 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="proxy-httpd" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296966 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="ceilometer-notification-agent" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.296973 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" containerName="sg-core" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.298868 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.303079 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.303738 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.309051 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322296 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322543 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322619 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322674 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " 
pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322754 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmp5s\" (UniqueName: \"kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322892 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.322983 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.330041 5118 scope.go:117] "RemoveContainer" containerID="e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.330704 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01\": container with ID starting with e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01 not found: ID does not exist" containerID="e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.330755 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01"} err="failed to get container status \"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01\": rpc error: code = NotFound desc = could not find container \"e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01\": container with ID starting with e0eea12301b3e0a6b807a36af93c43703d7995df965eb7d5b90d5f5a5a378f01 not found: ID does not exist" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.330786 5118 scope.go:117] "RemoveContainer" containerID="1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.338293 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f\": container with ID starting with 1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f not found: ID does not exist" containerID="1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.338347 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f"} err="failed to get container status \"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f\": rpc error: code = NotFound desc = could not find container \"1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f\": container with ID starting with 1fc44b220c4ac255c3532910f7d45f73ab4fbb6152969692f5c1def62d80584f not found: ID does not exist" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.339477 5118 scope.go:117] "RemoveContainer" containerID="bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.340072 5118 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75\": container with ID starting with bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75 not found: ID does not exist" containerID="bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.340115 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75"} err="failed to get container status \"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75\": rpc error: code = NotFound desc = could not find container \"bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75\": container with ID starting with bfd590abdedc7be1e1fc8a70a85a8197fa725999dd2043b5a2513d4514deea75 not found: ID does not exist" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.340130 5118 scope.go:117] "RemoveContainer" containerID="03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60" Feb 23 08:56:37 crc kubenswrapper[5118]: E0223 08:56:37.340604 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60\": container with ID starting with 03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60 not found: ID does not exist" containerID="03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.340646 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60"} err="failed to get container status \"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60\": rpc error: code = NotFound desc = could not find container 
\"03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60\": container with ID starting with 03b6806c8046197b1714c044fe4102544056db6c7023be558b4b9d738b4cec60 not found: ID does not exist" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428002 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428078 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428157 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmp5s\" (UniqueName: \"kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428191 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428235 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc 
kubenswrapper[5118]: I0223 08:56:37.428335 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.428429 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.435816 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.441536 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.441779 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.442206 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts\") pod \"ceilometer-0\" (UID: 
\"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.442313 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.443283 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.449460 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmp5s\" (UniqueName: \"kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s\") pod \"ceilometer-0\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.619437 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:56:37 crc kubenswrapper[5118]: I0223 08:56:37.716172 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e626a607-7d5e-41f2-8ba5-80b8177fc042" path="/var/lib/kubelet/pods/e626a607-7d5e-41f2-8ba5-80b8177fc042/volumes" Feb 23 08:56:38 crc kubenswrapper[5118]: I0223 08:56:38.197704 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:56:38 crc kubenswrapper[5118]: W0223 08:56:38.198875 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5170f948_8cb6_4560_be4e_41de9e6e49a6.slice/crio-9c091a10157114699f0caa73fa2b0aece7880384d71e3a3dd2e36a20eaa518f0 WatchSource:0}: Error finding container 9c091a10157114699f0caa73fa2b0aece7880384d71e3a3dd2e36a20eaa518f0: Status 404 returned error can't find the container with id 9c091a10157114699f0caa73fa2b0aece7880384d71e3a3dd2e36a20eaa518f0 Feb 23 08:56:38 crc kubenswrapper[5118]: I0223 08:56:38.209800 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7b334d5c-a053-474f-8395-432faf152c91","Type":"ContainerStarted","Data":"344ffcf82464cd9308a86a1f0294fa6b44131f4d77e7eecb834a63bf3b18b3d1"} Feb 23 08:56:38 crc kubenswrapper[5118]: I0223 08:56:38.244733 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.376550894 podStartE2EDuration="7.244710165s" podCreationTimestamp="2026-02-23 08:56:31 +0000 UTC" firstStartedPulling="2026-02-23 08:56:33.020930465 +0000 UTC m=+7856.024715038" lastFinishedPulling="2026-02-23 08:56:36.889089726 +0000 UTC m=+7859.892874309" observedRunningTime="2026-02-23 08:56:38.233358631 +0000 UTC m=+7861.237143214" watchObservedRunningTime="2026-02-23 08:56:38.244710165 +0000 UTC m=+7861.248494748" Feb 23 08:56:39 crc kubenswrapper[5118]: I0223 08:56:39.228508 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerStarted","Data":"e7303bc84e9624baac87e86dd74736b72b566f9bcf51667c3b4c3ff2a9bd1409"} Feb 23 08:56:39 crc kubenswrapper[5118]: I0223 08:56:39.228922 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerStarted","Data":"3498354cb376790d12c92d15320b4ae2ef3ce4601b1b2b8c467b0f579a83dd37"} Feb 23 08:56:39 crc kubenswrapper[5118]: I0223 08:56:39.228942 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerStarted","Data":"9c091a10157114699f0caa73fa2b0aece7880384d71e3a3dd2e36a20eaa518f0"} Feb 23 08:56:40 crc kubenswrapper[5118]: I0223 08:56:40.251628 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerStarted","Data":"2f178eaf3589cebc8d2d9ea75fb5105d01bf79b926699bb2b226f240f3f32518"} Feb 23 08:56:41 crc kubenswrapper[5118]: I0223 08:56:41.265294 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerStarted","Data":"c4bc8a9d49cd7f6d0e4e8f1ac5dacec12e0c351f66bc7bf5b9ed4264ff40f96c"} Feb 23 08:56:41 crc kubenswrapper[5118]: I0223 08:56:41.267627 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:56:41 crc kubenswrapper[5118]: I0223 08:56:41.294644 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.558785289 podStartE2EDuration="4.294621899s" podCreationTimestamp="2026-02-23 08:56:37 +0000 UTC" firstStartedPulling="2026-02-23 08:56:38.20213901 +0000 UTC m=+7861.205923583" lastFinishedPulling="2026-02-23 08:56:40.93797563 +0000 UTC m=+7863.941760193" 
observedRunningTime="2026-02-23 08:56:41.291834692 +0000 UTC m=+7864.295619305" watchObservedRunningTime="2026-02-23 08:56:41.294621899 +0000 UTC m=+7864.298406512" Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.943386 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-ctz6l"] Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.945253 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.960711 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-4dde-account-create-update-h7lbk"] Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.962440 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.965615 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 23 08:56:42 crc kubenswrapper[5118]: I0223 08:56:42.979892 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-ctz6l"] Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.011167 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4dde-account-create-update-h7lbk"] Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.068626 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.068734 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8hf\" (UniqueName: 
\"kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.068988 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48pt4\" (UniqueName: \"kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.069085 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.172330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48pt4\" (UniqueName: \"kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.172402 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.172478 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.172563 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8hf\" (UniqueName: \"kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.173356 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.173443 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.194562 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48pt4\" (UniqueName: \"kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4\") pod \"manila-4dde-account-create-update-h7lbk\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.196825 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tj8hf\" (UniqueName: \"kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf\") pod \"manila-db-create-ctz6l\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.266263 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.288993 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.767744 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-ctz6l"] Feb 23 08:56:43 crc kubenswrapper[5118]: W0223 08:56:43.769267 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd758df02_271f_469b_b9c4_ef4b56b0f745.slice/crio-5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53 WatchSource:0}: Error finding container 5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53: Status 404 returned error can't find the container with id 5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53 Feb 23 08:56:43 crc kubenswrapper[5118]: I0223 08:56:43.907875 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-4dde-account-create-update-h7lbk"] Feb 23 08:56:43 crc kubenswrapper[5118]: W0223 08:56:43.913555 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ba0846_0019_4e22_83da_01867d8f5605.slice/crio-57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587 WatchSource:0}: Error finding container 57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587: Status 404 returned error can't find the container with id 
57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587 Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.292137 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4dde-account-create-update-h7lbk" event={"ID":"06ba0846-0019-4e22-83da-01867d8f5605","Type":"ContainerStarted","Data":"568ca2fff95840aa64307a9d805e10c65cd0b3d649ce697e31dbb928c009b322"} Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.293321 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4dde-account-create-update-h7lbk" event={"ID":"06ba0846-0019-4e22-83da-01867d8f5605","Type":"ContainerStarted","Data":"57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587"} Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.300705 5118 generic.go:334] "Generic (PLEG): container finished" podID="d758df02-271f-469b-b9c4-ef4b56b0f745" containerID="1085916e24138bbcf53f8ec5956bce19754b84760c910821a629bf32816f65f9" exitCode=0 Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.300755 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ctz6l" event={"ID":"d758df02-271f-469b-b9c4-ef4b56b0f745","Type":"ContainerDied","Data":"1085916e24138bbcf53f8ec5956bce19754b84760c910821a629bf32816f65f9"} Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.300785 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ctz6l" event={"ID":"d758df02-271f-469b-b9c4-ef4b56b0f745","Type":"ContainerStarted","Data":"5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53"} Feb 23 08:56:44 crc kubenswrapper[5118]: I0223 08:56:44.330217 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-4dde-account-create-update-h7lbk" podStartSLOduration=2.330193648 podStartE2EDuration="2.330193648s" podCreationTimestamp="2026-02-23 08:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-23 08:56:44.313440204 +0000 UTC m=+7867.317224777" watchObservedRunningTime="2026-02-23 08:56:44.330193648 +0000 UTC m=+7867.333978221" Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.310883 5118 generic.go:334] "Generic (PLEG): container finished" podID="06ba0846-0019-4e22-83da-01867d8f5605" containerID="568ca2fff95840aa64307a9d805e10c65cd0b3d649ce697e31dbb928c009b322" exitCode=0 Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.313001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4dde-account-create-update-h7lbk" event={"ID":"06ba0846-0019-4e22-83da-01867d8f5605","Type":"ContainerDied","Data":"568ca2fff95840aa64307a9d805e10c65cd0b3d649ce697e31dbb928c009b322"} Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.810285 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.936137 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts\") pod \"d758df02-271f-469b-b9c4-ef4b56b0f745\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.936301 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8hf\" (UniqueName: \"kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf\") pod \"d758df02-271f-469b-b9c4-ef4b56b0f745\" (UID: \"d758df02-271f-469b-b9c4-ef4b56b0f745\") " Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.936674 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d758df02-271f-469b-b9c4-ef4b56b0f745" (UID: 
"d758df02-271f-469b-b9c4-ef4b56b0f745"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.937482 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d758df02-271f-469b-b9c4-ef4b56b0f745-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:45 crc kubenswrapper[5118]: I0223 08:56:45.944626 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf" (OuterVolumeSpecName: "kube-api-access-tj8hf") pod "d758df02-271f-469b-b9c4-ef4b56b0f745" (UID: "d758df02-271f-469b-b9c4-ef4b56b0f745"). InnerVolumeSpecName "kube-api-access-tj8hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.039870 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj8hf\" (UniqueName: \"kubernetes.io/projected/d758df02-271f-469b-b9c4-ef4b56b0f745-kube-api-access-tj8hf\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.063738 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gj7x7"] Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.075617 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gj7x7"] Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.328651 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ctz6l" event={"ID":"d758df02-271f-469b-b9c4-ef4b56b0f745","Type":"ContainerDied","Data":"5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53"} Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.328713 5118 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5d5f07beaf90384fb641a08b80f375a2520050a920f537ffaa30001b9d859a53" Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.328683 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ctz6l" Feb 23 08:56:46 crc kubenswrapper[5118]: I0223 08:56:46.902759 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.038277 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7m6lm"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.048270 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-41e6-account-create-update-4pfpl"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.059426 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kj9s6"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.064787 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts\") pod \"06ba0846-0019-4e22-83da-01867d8f5605\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.065113 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48pt4\" (UniqueName: \"kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4\") pod \"06ba0846-0019-4e22-83da-01867d8f5605\" (UID: \"06ba0846-0019-4e22-83da-01867d8f5605\") " Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.065444 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06ba0846-0019-4e22-83da-01867d8f5605" (UID: 
"06ba0846-0019-4e22-83da-01867d8f5605"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.065710 5118 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06ba0846-0019-4e22-83da-01867d8f5605-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.067149 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-f7e0-account-create-update-z7f5c"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.069140 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4" (OuterVolumeSpecName: "kube-api-access-48pt4") pod "06ba0846-0019-4e22-83da-01867d8f5605" (UID: "06ba0846-0019-4e22-83da-01867d8f5605"). InnerVolumeSpecName "kube-api-access-48pt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.087004 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-41e6-account-create-update-4pfpl"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.098689 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-f7e0-account-create-update-z7f5c"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.110410 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c2f9-account-create-update-rrh2l"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.120327 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kj9s6"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.128735 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7m6lm"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.138433 5118 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c2f9-account-create-update-rrh2l"] Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.167602 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48pt4\" (UniqueName: \"kubernetes.io/projected/06ba0846-0019-4e22-83da-01867d8f5605-kube-api-access-48pt4\") on node \"crc\" DevicePath \"\"" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.344748 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-4dde-account-create-update-h7lbk" event={"ID":"06ba0846-0019-4e22-83da-01867d8f5605","Type":"ContainerDied","Data":"57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587"} Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.344809 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f7c02df031fbf5df1cb2b45af0998cd3e0b26e654f18928316336d1d3d5587" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.344823 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-4dde-account-create-update-h7lbk" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.724965 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060732fd-ca60-43ca-9760-18366ff75e31" path="/var/lib/kubelet/pods/060732fd-ca60-43ca-9760-18366ff75e31/volumes" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.726859 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65965047-113d-44a2-87f9-925f161e5c0c" path="/var/lib/kubelet/pods/65965047-113d-44a2-87f9-925f161e5c0c/volumes" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.728193 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d986e2-7735-490b-b3e5-4446205f094e" path="/var/lib/kubelet/pods/84d986e2-7735-490b-b3e5-4446205f094e/volumes" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.729493 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99fca003-f1bf-42ec-9bac-8c1bb4cc37c1" path="/var/lib/kubelet/pods/99fca003-f1bf-42ec-9bac-8c1bb4cc37c1/volumes" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.731347 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a7092c-8bd3-462b-8ce9-8dd40cc35067" path="/var/lib/kubelet/pods/a5a7092c-8bd3-462b-8ce9-8dd40cc35067/volumes" Feb 23 08:56:47 crc kubenswrapper[5118]: I0223 08:56:47.731941 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39e8379-74ca-4713-8494-81de13ec2734" path="/var/lib/kubelet/pods/c39e8379-74ca-4713-8494-81de13ec2734/volumes" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.311827 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-rv82j"] Feb 23 08:56:48 crc kubenswrapper[5118]: E0223 08:56:48.312355 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d758df02-271f-469b-b9c4-ef4b56b0f745" containerName="mariadb-database-create" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 
08:56:48.312373 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d758df02-271f-469b-b9c4-ef4b56b0f745" containerName="mariadb-database-create" Feb 23 08:56:48 crc kubenswrapper[5118]: E0223 08:56:48.312391 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ba0846-0019-4e22-83da-01867d8f5605" containerName="mariadb-account-create-update" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.312398 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ba0846-0019-4e22-83da-01867d8f5605" containerName="mariadb-account-create-update" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.312602 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ba0846-0019-4e22-83da-01867d8f5605" containerName="mariadb-account-create-update" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.312625 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d758df02-271f-469b-b9c4-ef4b56b0f745" containerName="mariadb-database-create" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.313361 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.320189 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-pxh74" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.322827 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rv82j"] Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.323002 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.398803 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.398866 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-899st\" (UniqueName: \"kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.398901 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.399193 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.501338 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.501658 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.501690 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-899st\" (UniqueName: \"kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.501722 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.507301 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data\") pod \"manila-db-sync-rv82j\" (UID: 
\"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.507644 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.510018 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.519874 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-899st\" (UniqueName: \"kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st\") pod \"manila-db-sync-rv82j\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:48 crc kubenswrapper[5118]: I0223 08:56:48.633033 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rv82j" Feb 23 08:56:49 crc kubenswrapper[5118]: W0223 08:56:49.633696 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c07546f_84d3_41cf_a43a_f6cbe7129aca.slice/crio-159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263 WatchSource:0}: Error finding container 159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263: Status 404 returned error can't find the container with id 159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263 Feb 23 08:56:49 crc kubenswrapper[5118]: I0223 08:56:49.635877 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-rv82j"] Feb 23 08:56:50 crc kubenswrapper[5118]: I0223 08:56:50.372839 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rv82j" event={"ID":"1c07546f-84d3-41cf-a43a-f6cbe7129aca","Type":"ContainerStarted","Data":"159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263"} Feb 23 08:56:57 crc kubenswrapper[5118]: I0223 08:56:57.454630 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rv82j" event={"ID":"1c07546f-84d3-41cf-a43a-f6cbe7129aca","Type":"ContainerStarted","Data":"e85fee9128b7a1f5e96cf1fef037d6a1c38a7bfa72954553ede8111475ae186e"} Feb 23 08:56:57 crc kubenswrapper[5118]: I0223 08:56:57.491366 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-rv82j" podStartSLOduration=2.493152827 podStartE2EDuration="9.491337721s" podCreationTimestamp="2026-02-23 08:56:48 +0000 UTC" firstStartedPulling="2026-02-23 08:56:49.638890313 +0000 UTC m=+7872.642674886" lastFinishedPulling="2026-02-23 08:56:56.637075207 +0000 UTC m=+7879.640859780" observedRunningTime="2026-02-23 08:56:57.479035256 +0000 UTC m=+7880.482819889" watchObservedRunningTime="2026-02-23 08:56:57.491337721 +0000 UTC m=+7880.495122324" Feb 23 
08:56:59 crc kubenswrapper[5118]: I0223 08:56:59.478466 5118 generic.go:334] "Generic (PLEG): container finished" podID="1c07546f-84d3-41cf-a43a-f6cbe7129aca" containerID="e85fee9128b7a1f5e96cf1fef037d6a1c38a7bfa72954553ede8111475ae186e" exitCode=0 Feb 23 08:56:59 crc kubenswrapper[5118]: I0223 08:56:59.478829 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rv82j" event={"ID":"1c07546f-84d3-41cf-a43a-f6cbe7129aca","Type":"ContainerDied","Data":"e85fee9128b7a1f5e96cf1fef037d6a1c38a7bfa72954553ede8111475ae186e"} Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.092820 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-rv82j" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.155171 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-899st\" (UniqueName: \"kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st\") pod \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.155312 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data\") pod \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.155359 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data\") pod \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.155487 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle\") pod \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\" (UID: \"1c07546f-84d3-41cf-a43a-f6cbe7129aca\") " Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.163384 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st" (OuterVolumeSpecName: "kube-api-access-899st") pod "1c07546f-84d3-41cf-a43a-f6cbe7129aca" (UID: "1c07546f-84d3-41cf-a43a-f6cbe7129aca"). InnerVolumeSpecName "kube-api-access-899st". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.167112 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "1c07546f-84d3-41cf-a43a-f6cbe7129aca" (UID: "1c07546f-84d3-41cf-a43a-f6cbe7129aca"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.171020 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data" (OuterVolumeSpecName: "config-data") pod "1c07546f-84d3-41cf-a43a-f6cbe7129aca" (UID: "1c07546f-84d3-41cf-a43a-f6cbe7129aca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.195336 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c07546f-84d3-41cf-a43a-f6cbe7129aca" (UID: "1c07546f-84d3-41cf-a43a-f6cbe7129aca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.258442 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-899st\" (UniqueName: \"kubernetes.io/projected/1c07546f-84d3-41cf-a43a-f6cbe7129aca-kube-api-access-899st\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.258489 5118 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.258503 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.258516 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c07546f-84d3-41cf-a43a-f6cbe7129aca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.505654 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-rv82j" event={"ID":"1c07546f-84d3-41cf-a43a-f6cbe7129aca","Type":"ContainerDied","Data":"159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263"} Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.505700 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159add5ca5aa116c97c870ec55a43ec0e75832cd0c800bcc718e3309bf789263" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.505787 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-rv82j" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.900053 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 23 08:57:01 crc kubenswrapper[5118]: E0223 08:57:01.900454 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c07546f-84d3-41cf-a43a-f6cbe7129aca" containerName="manila-db-sync" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.900471 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c07546f-84d3-41cf-a43a-f6cbe7129aca" containerName="manila-db-sync" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.900676 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c07546f-84d3-41cf-a43a-f6cbe7129aca" containerName="manila-db-sync" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.901722 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.905423 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.906190 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.906386 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-pxh74" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.906691 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.937496 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978236 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7b8w\" 
(UniqueName: \"kubernetes.io/projected/e67a6d39-e2b4-4133-b908-717e9f957170-kube-api-access-c7b8w\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978526 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978554 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978679 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-scripts\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:01 crc kubenswrapper[5118]: I0223 08:57:01.978733 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e67a6d39-e2b4-4133-b908-717e9f957170-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.006905 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.008631 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.021486 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.022210 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083281 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-scripts\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e67a6d39-e2b4-4133-b908-717e9f957170-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083378 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-scripts\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: 
I0223 08:57:02.083399 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083434 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7b8w\" (UniqueName: \"kubernetes.io/projected/e67a6d39-e2b4-4133-b908-717e9f957170-kube-api-access-c7b8w\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083455 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083477 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083497 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pzz\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-kube-api-access-96pzz\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083526 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083558 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083592 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083644 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083765 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.083788 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-ceph\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.084748 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e67a6d39-e2b4-4133-b908-717e9f957170-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.092639 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.095357 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.099748 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.107942 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-scripts\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.108961 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 
08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.111220 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7b8w\" (UniqueName: \"kubernetes.io/projected/e67a6d39-e2b4-4133-b908-717e9f957170-kube-api-access-c7b8w\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.112730 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e67a6d39-e2b4-4133-b908-717e9f957170-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e67a6d39-e2b4-4133-b908-717e9f957170\") " pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.115853 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185066 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185127 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185152 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-ceph\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " 
pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185227 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-scripts\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185244 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185298 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pzz\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-kube-api-access-96pzz\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185328 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185378 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.185464 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.187516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/3a077e0c-4058-463d-a95c-5566479cd3af-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.192685 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.192972 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.194649 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-ceph\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.195758 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-scripts\") pod 
\"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.207021 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a077e0c-4058-463d-a95c-5566479cd3af-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.215747 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pzz\" (UniqueName: \"kubernetes.io/projected/3a077e0c-4058-463d-a95c-5566479cd3af-kube-api-access-96pzz\") pod \"manila-share-share1-0\" (UID: \"3a077e0c-4058-463d-a95c-5566479cd3af\") " pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.233933 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.252774 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.272231 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.272353 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.276341 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290655 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphqz\" (UniqueName: \"kubernetes.io/projected/3970e01b-cedb-40fc-9594-55ec615ff971-kube-api-access-xphqz\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290705 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290749 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bt5\" (UniqueName: \"kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290786 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290812 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290849 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data-custom\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290872 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3970e01b-cedb-40fc-9594-55ec615ff971-etc-machine-id\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290914 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290939 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290957 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3970e01b-cedb-40fc-9594-55ec615ff971-logs\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.290994 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.291015 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-scripts\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.352679 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.392730 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data-custom\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393081 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3970e01b-cedb-40fc-9594-55ec615ff971-etc-machine-id\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393144 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393167 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393184 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3970e01b-cedb-40fc-9594-55ec615ff971-logs\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393215 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393236 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-scripts\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393264 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphqz\" (UniqueName: \"kubernetes.io/projected/3970e01b-cedb-40fc-9594-55ec615ff971-kube-api-access-xphqz\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393289 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393330 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bt5\" (UniqueName: \"kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393358 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config\") pod 
\"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.393378 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.394971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.395742 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.398493 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.398609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3970e01b-cedb-40fc-9594-55ec615ff971-etc-machine-id\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 
crc kubenswrapper[5118]: I0223 08:57:02.399819 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config\") pod \"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.401857 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data-custom\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.403526 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3970e01b-cedb-40fc-9594-55ec615ff971-logs\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.416422 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.418405 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-config-data\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.429750 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bt5\" (UniqueName: \"kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5\") pod 
\"dnsmasq-dns-5fd8bf8f99-s7tfz\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.430716 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphqz\" (UniqueName: \"kubernetes.io/projected/3970e01b-cedb-40fc-9594-55ec615ff971-kube-api-access-xphqz\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.433351 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3970e01b-cedb-40fc-9594-55ec615ff971-scripts\") pod \"manila-api-0\" (UID: \"3970e01b-cedb-40fc-9594-55ec615ff971\") " pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.622213 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.631994 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 23 08:57:02 crc kubenswrapper[5118]: I0223 08:57:02.837069 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.077757 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 23 08:57:03 crc kubenswrapper[5118]: W0223 08:57:03.223447 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a077e0c_4058_463d_a95c_5566479cd3af.slice/crio-86ddc508fcbef08c78f9c81f2c17197587aa00b1c3e4eb1c0094a429e34b3af2 WatchSource:0}: Error finding container 86ddc508fcbef08c78f9c81f2c17197587aa00b1c3e4eb1c0094a429e34b3af2: Status 404 returned error can't find the container with id 86ddc508fcbef08c78f9c81f2c17197587aa00b1c3e4eb1c0094a429e34b3af2 Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.267652 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.419907 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.560277 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3a077e0c-4058-463d-a95c-5566479cd3af","Type":"ContainerStarted","Data":"86ddc508fcbef08c78f9c81f2c17197587aa00b1c3e4eb1c0094a429e34b3af2"} Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.570816 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3970e01b-cedb-40fc-9594-55ec615ff971","Type":"ContainerStarted","Data":"de2e1d8fd840991b5201ff442328ed39b55a636127cbd4433a189be0aa1de43e"} Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.582848 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" 
event={"ID":"fb4d376f-4315-4f50-925f-9a88f42d12c2","Type":"ContainerStarted","Data":"4ec0f882eef60f97f02eaa29969c29beafbd33b19a01ee1c351194377dbb9ae6"} Feb 23 08:57:03 crc kubenswrapper[5118]: I0223 08:57:03.585692 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e67a6d39-e2b4-4133-b908-717e9f957170","Type":"ContainerStarted","Data":"fa154af640e06402f2c768adc90df2147222e45f064a3bdcdf3183bfe193e090"} Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.600370 5118 scope.go:117] "RemoveContainer" containerID="2da11e32d4e311d7141f7671012e3579e2c7a1c554c749bc940b3fdc4d67c6b5" Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.622526 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e67a6d39-e2b4-4133-b908-717e9f957170","Type":"ContainerStarted","Data":"5540e2e5852eefec91b0af779a456b9b34a9448fe40a088a1d1c1717930aff1b"} Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.636881 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3970e01b-cedb-40fc-9594-55ec615ff971","Type":"ContainerStarted","Data":"b6cc475b067ab6e1d682ece59d07cfca18999c9ccd22f648fde7b595a8414ded"} Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.647622 5118 scope.go:117] "RemoveContainer" containerID="8f21f817072d9e4dad2a061e5e39b5779c18192efd1982a628161d3a1bbddb29" Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.665249 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerID="486b85afb16fd0af16ddd940bf606202f136994781b28465549d41bdc3558c9d" exitCode=0 Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.665317 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" event={"ID":"fb4d376f-4315-4f50-925f-9a88f42d12c2","Type":"ContainerDied","Data":"486b85afb16fd0af16ddd940bf606202f136994781b28465549d41bdc3558c9d"} Feb 23 
08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.734669 5118 scope.go:117] "RemoveContainer" containerID="26fc64a0bfae4863969afe929650aa6eac6e469c0339accb15675b4c355ed0b4" Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.796847 5118 scope.go:117] "RemoveContainer" containerID="1654a3c8278af0612f23b34c40534afbac9ab069b0e5238f98524ae4c22a5f98" Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.829829 5118 scope.go:117] "RemoveContainer" containerID="ebb7ab81d7ff1968e30477440f577825c218f87601d3aa92e1d7ee194427fba0" Feb 23 08:57:04 crc kubenswrapper[5118]: I0223 08:57:04.859053 5118 scope.go:117] "RemoveContainer" containerID="e2841cd7138b6367db8df98e2cd4100108182c6f0a2dbbf5a0bf8aaf87fa2bfe" Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.684600 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3970e01b-cedb-40fc-9594-55ec615ff971","Type":"ContainerStarted","Data":"5da9419faefc9486759a61aa91f8a8bb5298eb749e752ee896ed7a4bce317ac3"} Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.686942 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.692782 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" event={"ID":"fb4d376f-4315-4f50-925f-9a88f42d12c2","Type":"ContainerStarted","Data":"69d2ee957e48bdee7c2ffc1e9effb3c71c81c1f1ac38d8c40c330606bb072c98"} Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.692931 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.695005 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e67a6d39-e2b4-4133-b908-717e9f957170","Type":"ContainerStarted","Data":"cd08710ec84c9e1c41387e34ca323135445f67a6d9ebd877e23902f85d697b4c"} Feb 23 08:57:05 crc 
kubenswrapper[5118]: I0223 08:57:05.721289 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.721264201 podStartE2EDuration="3.721264201s" podCreationTimestamp="2026-02-23 08:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:57:05.705712357 +0000 UTC m=+7888.709496930" watchObservedRunningTime="2026-02-23 08:57:05.721264201 +0000 UTC m=+7888.725048774" Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.734657 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" podStartSLOduration=3.734640334 podStartE2EDuration="3.734640334s" podCreationTimestamp="2026-02-23 08:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:57:05.733753332 +0000 UTC m=+7888.737537905" watchObservedRunningTime="2026-02-23 08:57:05.734640334 +0000 UTC m=+7888.738424907" Feb 23 08:57:05 crc kubenswrapper[5118]: I0223 08:57:05.754269 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.31927715 podStartE2EDuration="4.754252416s" podCreationTimestamp="2026-02-23 08:57:01 +0000 UTC" firstStartedPulling="2026-02-23 08:57:02.853905104 +0000 UTC m=+7885.857689677" lastFinishedPulling="2026-02-23 08:57:03.28888037 +0000 UTC m=+7886.292664943" observedRunningTime="2026-02-23 08:57:05.751630563 +0000 UTC m=+7888.755415166" watchObservedRunningTime="2026-02-23 08:57:05.754252416 +0000 UTC m=+7888.758036989" Feb 23 08:57:07 crc kubenswrapper[5118]: I0223 08:57:07.629457 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 08:57:10 crc kubenswrapper[5118]: I0223 08:57:10.046260 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-6cvjz"] Feb 23 08:57:10 crc kubenswrapper[5118]: I0223 08:57:10.056997 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6cvjz"] Feb 23 08:57:11 crc kubenswrapper[5118]: I0223 08:57:11.719419 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de48519-b3d7-4123-8138-1151dbd0f25e" path="/var/lib/kubelet/pods/3de48519-b3d7-4123-8138-1151dbd0f25e/volumes" Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.239633 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.623244 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.733301 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"] Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.733622 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="dnsmasq-dns" containerID="cri-o://3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b" gracePeriod=10 Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.771541 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3a077e0c-4058-463d-a95c-5566479cd3af","Type":"ContainerStarted","Data":"8931731ec767909089baaba5eb5d0da4ded86d8e3acd126e8024c0fb09463378"} Feb 23 08:57:12 crc kubenswrapper[5118]: I0223 08:57:12.771616 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"3a077e0c-4058-463d-a95c-5566479cd3af","Type":"ContainerStarted","Data":"ca1ae14ce2377ed4e956bf686b0836000e6f033625c57822f210018b7d5cd3ae"} Feb 23 08:57:12 crc 
kubenswrapper[5118]: I0223 08:57:12.794428 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.401574428 podStartE2EDuration="11.794407511s" podCreationTimestamp="2026-02-23 08:57:01 +0000 UTC" firstStartedPulling="2026-02-23 08:57:03.225706449 +0000 UTC m=+7886.229491022" lastFinishedPulling="2026-02-23 08:57:11.618539532 +0000 UTC m=+7894.622324105" observedRunningTime="2026-02-23 08:57:12.791354878 +0000 UTC m=+7895.795139461" watchObservedRunningTime="2026-02-23 08:57:12.794407511 +0000 UTC m=+7895.798192084" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.333678 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.517931 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc\") pod \"214a2d91-6efd-4585-951e-00ab57da645b\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.519925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config\") pod \"214a2d91-6efd-4585-951e-00ab57da645b\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.519984 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qlmh\" (UniqueName: \"kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh\") pod \"214a2d91-6efd-4585-951e-00ab57da645b\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.520189 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb\") pod \"214a2d91-6efd-4585-951e-00ab57da645b\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.521161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb\") pod \"214a2d91-6efd-4585-951e-00ab57da645b\" (UID: \"214a2d91-6efd-4585-951e-00ab57da645b\") " Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.526221 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh" (OuterVolumeSpecName: "kube-api-access-2qlmh") pod "214a2d91-6efd-4585-951e-00ab57da645b" (UID: "214a2d91-6efd-4585-951e-00ab57da645b"). InnerVolumeSpecName "kube-api-access-2qlmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.574767 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "214a2d91-6efd-4585-951e-00ab57da645b" (UID: "214a2d91-6efd-4585-951e-00ab57da645b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.582812 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config" (OuterVolumeSpecName: "config") pod "214a2d91-6efd-4585-951e-00ab57da645b" (UID: "214a2d91-6efd-4585-951e-00ab57da645b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.584435 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "214a2d91-6efd-4585-951e-00ab57da645b" (UID: "214a2d91-6efd-4585-951e-00ab57da645b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.608487 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "214a2d91-6efd-4585-951e-00ab57da645b" (UID: "214a2d91-6efd-4585-951e-00ab57da645b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.624066 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.624296 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.624378 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.624443 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a2d91-6efd-4585-951e-00ab57da645b-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:13 crc 
kubenswrapper[5118]: I0223 08:57:13.624501 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qlmh\" (UniqueName: \"kubernetes.io/projected/214a2d91-6efd-4585-951e-00ab57da645b-kube-api-access-2qlmh\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.781997 5118 generic.go:334] "Generic (PLEG): container finished" podID="214a2d91-6efd-4585-951e-00ab57da645b" containerID="3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b" exitCode=0 Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.782967 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.783490 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" event={"ID":"214a2d91-6efd-4585-951e-00ab57da645b","Type":"ContainerDied","Data":"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b"} Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.783519 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d64ffc9-rwxpm" event={"ID":"214a2d91-6efd-4585-951e-00ab57da645b","Type":"ContainerDied","Data":"cc8a8ed5f04d1d0443062700e1bdfd6730efe5cef7e675c5bc6cbbab255cd06e"} Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.783536 5118 scope.go:117] "RemoveContainer" containerID="3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.805300 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"] Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.811623 5118 scope.go:117] "RemoveContainer" containerID="b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.814821 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7b6d64ffc9-rwxpm"] Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.846182 5118 scope.go:117] "RemoveContainer" containerID="3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b" Feb 23 08:57:13 crc kubenswrapper[5118]: E0223 08:57:13.847435 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b\": container with ID starting with 3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b not found: ID does not exist" containerID="3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.847464 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b"} err="failed to get container status \"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b\": rpc error: code = NotFound desc = could not find container \"3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b\": container with ID starting with 3b06f3295b52d8cde9b9b4c86fa56ab4443b2c3f1a13d8b7b54f159ba16a7c8b not found: ID does not exist" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.847486 5118 scope.go:117] "RemoveContainer" containerID="b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e" Feb 23 08:57:13 crc kubenswrapper[5118]: E0223 08:57:13.847693 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e\": container with ID starting with b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e not found: ID does not exist" containerID="b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e" Feb 23 08:57:13 crc kubenswrapper[5118]: I0223 08:57:13.847713 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e"} err="failed to get container status \"b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e\": rpc error: code = NotFound desc = could not find container \"b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e\": container with ID starting with b48f8d7988d55cd532ab207f90782387113e07d7b139dae8386346e75a9b724e not found: ID does not exist" Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.244637 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.245156 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-central-agent" containerID="cri-o://3498354cb376790d12c92d15320b4ae2ef3ce4601b1b2b8c467b0f579a83dd37" gracePeriod=30 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.245270 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-notification-agent" containerID="cri-o://e7303bc84e9624baac87e86dd74736b72b566f9bcf51667c3b4c3ff2a9bd1409" gracePeriod=30 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.245248 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="sg-core" containerID="cri-o://2f178eaf3589cebc8d2d9ea75fb5105d01bf79b926699bb2b226f240f3f32518" gracePeriod=30 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.245458 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="proxy-httpd" 
containerID="cri-o://c4bc8a9d49cd7f6d0e4e8f1ac5dacec12e0c351f66bc7bf5b9ed4264ff40f96c" gracePeriod=30 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.708526 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214a2d91-6efd-4585-951e-00ab57da645b" path="/var/lib/kubelet/pods/214a2d91-6efd-4585-951e-00ab57da645b/volumes" Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804416 5118 generic.go:334] "Generic (PLEG): container finished" podID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerID="c4bc8a9d49cd7f6d0e4e8f1ac5dacec12e0c351f66bc7bf5b9ed4264ff40f96c" exitCode=0 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804456 5118 generic.go:334] "Generic (PLEG): container finished" podID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerID="2f178eaf3589cebc8d2d9ea75fb5105d01bf79b926699bb2b226f240f3f32518" exitCode=2 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804468 5118 generic.go:334] "Generic (PLEG): container finished" podID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerID="3498354cb376790d12c92d15320b4ae2ef3ce4601b1b2b8c467b0f579a83dd37" exitCode=0 Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804493 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerDied","Data":"c4bc8a9d49cd7f6d0e4e8f1ac5dacec12e0c351f66bc7bf5b9ed4264ff40f96c"} Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804526 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerDied","Data":"2f178eaf3589cebc8d2d9ea75fb5105d01bf79b926699bb2b226f240f3f32518"} Feb 23 08:57:15 crc kubenswrapper[5118]: I0223 08:57:15.804539 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerDied","Data":"3498354cb376790d12c92d15320b4ae2ef3ce4601b1b2b8c467b0f579a83dd37"} Feb 23 08:57:21 crc kubenswrapper[5118]: I0223 08:57:21.870382 5118 generic.go:334] "Generic (PLEG): container finished" podID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerID="e7303bc84e9624baac87e86dd74736b72b566f9bcf51667c3b4c3ff2a9bd1409" exitCode=0 Feb 23 08:57:21 crc kubenswrapper[5118]: I0223 08:57:21.870462 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerDied","Data":"e7303bc84e9624baac87e86dd74736b72b566f9bcf51667c3b4c3ff2a9bd1409"} Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.138506 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303085 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303191 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303348 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303426 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303572 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303691 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmp5s\" (UniqueName: \"kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.303747 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd\") pod \"5170f948-8cb6-4560-be4e-41de9e6e49a6\" (UID: \"5170f948-8cb6-4560-be4e-41de9e6e49a6\") " Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.304618 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.304669 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.309830 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s" (OuterVolumeSpecName: "kube-api-access-jmp5s") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "kube-api-access-jmp5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.315521 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts" (OuterVolumeSpecName: "scripts") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.339229 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.353058 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.410823 5118 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.410872 5118 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-scripts\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.410884 5118 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5170f948-8cb6-4560-be4e-41de9e6e49a6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.410894 5118 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.410906 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmp5s\" (UniqueName: \"kubernetes.io/projected/5170f948-8cb6-4560-be4e-41de9e6e49a6-kube-api-access-jmp5s\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.421865 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.453260 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data" (OuterVolumeSpecName: "config-data") pod "5170f948-8cb6-4560-be4e-41de9e6e49a6" (UID: "5170f948-8cb6-4560-be4e-41de9e6e49a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.512549 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.512582 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5170f948-8cb6-4560-be4e-41de9e6e49a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.919417 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5170f948-8cb6-4560-be4e-41de9e6e49a6","Type":"ContainerDied","Data":"9c091a10157114699f0caa73fa2b0aece7880384d71e3a3dd2e36a20eaa518f0"} Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.919495 5118 scope.go:117] "RemoveContainer" containerID="c4bc8a9d49cd7f6d0e4e8f1ac5dacec12e0c351f66bc7bf5b9ed4264ff40f96c" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.919540 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.949294 5118 scope.go:117] "RemoveContainer" containerID="2f178eaf3589cebc8d2d9ea75fb5105d01bf79b926699bb2b226f240f3f32518" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.966643 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.982968 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.992673 5118 scope.go:117] "RemoveContainer" containerID="e7303bc84e9624baac87e86dd74736b72b566f9bcf51667c3b4c3ff2a9bd1409" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997144 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997604 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="sg-core" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997624 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="sg-core" Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997647 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-notification-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997654 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-notification-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997669 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-central-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997675 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-central-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997691 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="dnsmasq-dns" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997697 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="dnsmasq-dns" Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997704 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="proxy-httpd" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997710 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="proxy-httpd" Feb 23 08:57:22 crc kubenswrapper[5118]: E0223 08:57:22.997725 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="init" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997731 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="init" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997914 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="sg-core" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997935 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="214a2d91-6efd-4585-951e-00ab57da645b" containerName="dnsmasq-dns" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997944 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-central-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997957 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="ceilometer-notification-agent" Feb 23 08:57:22 crc kubenswrapper[5118]: I0223 08:57:22.997969 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" containerName="proxy-httpd" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.000452 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.008717 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.008951 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.025145 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.055367 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gkq\" (UniqueName: \"kubernetes.io/projected/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-kube-api-access-67gkq\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.055576 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.055623 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.055669 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-scripts\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.059668 5118 scope.go:117] "RemoveContainer" containerID="3498354cb376790d12c92d15320b4ae2ef3ce4601b1b2b8c467b0f579a83dd37" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.060452 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.060612 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-config-data\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.060708 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163443 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67gkq\" (UniqueName: \"kubernetes.io/projected/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-kube-api-access-67gkq\") pod 
\"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163489 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163516 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-run-httpd\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163543 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-scripts\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163595 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163643 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-config-data\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.163678 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.164271 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-log-httpd\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.164334 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-run-httpd\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.169582 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-scripts\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.170832 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.171210 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.176626 
5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-config-data\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.201404 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67gkq\" (UniqueName: \"kubernetes.io/projected/c23b5987-be35-42db-b1d5-cfaadfbdb9e0-kube-api-access-67gkq\") pod \"ceilometer-0\" (UID: \"c23b5987-be35-42db-b1d5-cfaadfbdb9e0\") " pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.340394 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.708326 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5170f948-8cb6-4560-be4e-41de9e6e49a6" path="/var/lib/kubelet/pods/5170f948-8cb6-4560-be4e-41de9e6e49a6/volumes" Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.914387 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.930993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"ccbe4277b33366cf1dec7f92b792f9d09df991eca87e86d89d77b45efa25dec9"} Feb 23 08:57:23 crc kubenswrapper[5118]: I0223 08:57:23.992197 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 23 08:57:24 crc kubenswrapper[5118]: I0223 08:57:24.122361 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 23 08:57:24 crc kubenswrapper[5118]: I0223 08:57:24.233457 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/manila-api-0" Feb 23 08:57:24 crc kubenswrapper[5118]: I0223 08:57:24.942516 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84"} Feb 23 08:57:25 crc kubenswrapper[5118]: I0223 08:57:25.959173 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"fc1b2d783e03038b3ae8bdfe64b5e0972b2db2f34cd8cfd52b93881a2d08dbbf"} Feb 23 08:57:25 crc kubenswrapper[5118]: I0223 08:57:25.959993 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"160877543d243e73bc20a153c371c35646be5defab176c7deddaca4f055aba2f"} Feb 23 08:57:28 crc kubenswrapper[5118]: I0223 08:57:28.008427 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"7ced83174c74696df47ef4b3772adf7f5e0c8e5d3e2044473b58c11d1d6b2135"} Feb 23 08:57:28 crc kubenswrapper[5118]: I0223 08:57:28.009410 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 23 08:57:28 crc kubenswrapper[5118]: I0223 08:57:28.034323 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.216897687 podStartE2EDuration="6.034301721s" podCreationTimestamp="2026-02-23 08:57:22 +0000 UTC" firstStartedPulling="2026-02-23 08:57:23.913017033 +0000 UTC m=+7906.916801606" lastFinishedPulling="2026-02-23 08:57:26.730421057 +0000 UTC m=+7909.734205640" observedRunningTime="2026-02-23 08:57:28.032074537 +0000 UTC m=+7911.035859130" watchObservedRunningTime="2026-02-23 08:57:28.034301721 +0000 UTC m=+7911.038086294" Feb 
23 08:57:29 crc kubenswrapper[5118]: I0223 08:57:29.052878 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-twmzg"] Feb 23 08:57:29 crc kubenswrapper[5118]: I0223 08:57:29.075171 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-twmzg"] Feb 23 08:57:29 crc kubenswrapper[5118]: I0223 08:57:29.714596 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c16330-f3b7-4cad-ad50-34a5e5f79b9b" path="/var/lib/kubelet/pods/10c16330-f3b7-4cad-ad50-34a5e5f79b9b/volumes" Feb 23 08:57:30 crc kubenswrapper[5118]: I0223 08:57:30.040457 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pn8cv"] Feb 23 08:57:30 crc kubenswrapper[5118]: I0223 08:57:30.048853 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pn8cv"] Feb 23 08:57:31 crc kubenswrapper[5118]: I0223 08:57:31.708847 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301b65e4-1307-4a85-a3fc-54d4b49508d6" path="/var/lib/kubelet/pods/301b65e4-1307-4a85-a3fc-54d4b49508d6/volumes" Feb 23 08:57:44 crc kubenswrapper[5118]: I0223 08:57:44.069229 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxcjh"] Feb 23 08:57:44 crc kubenswrapper[5118]: I0223 08:57:44.078349 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wxcjh"] Feb 23 08:57:45 crc kubenswrapper[5118]: I0223 08:57:45.720284 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6582370-a366-423e-88a4-da5b540bb5bf" path="/var/lib/kubelet/pods/d6582370-a366-423e-88a4-da5b540bb5bf/volumes" Feb 23 08:57:53 crc kubenswrapper[5118]: I0223 08:57:53.344989 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 23 08:58:02 crc kubenswrapper[5118]: I0223 08:58:02.975259 5118 patch_prober.go:28] 
interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:58:02 crc kubenswrapper[5118]: I0223 08:58:02.975873 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.070590 5118 scope.go:117] "RemoveContainer" containerID="00e6e1f98f77a4e9c690fa82a16f640b300f205807b666add1533b5760dd00a5" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.311895 5118 scope.go:117] "RemoveContainer" containerID="cee75633c13bcd9438093bc06917d7cccea8cb048389510b162a73da2ad95fe3" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.345474 5118 scope.go:117] "RemoveContainer" containerID="e5770682a4a710cf5e4872bfff0f41b98c32bed8380f55063354b8a5b03f9cb4" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.415911 5118 scope.go:117] "RemoveContainer" containerID="c82a1db7dcb8c470362958480bd9c7af8a74826c7699f87017adf348392a7cc7" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.446121 5118 scope.go:117] "RemoveContainer" containerID="7fb0d63f9689552d1e43a3c320a8c7d479c783f5dccff98b23d71e33f1ed0c18" Feb 23 08:58:05 crc kubenswrapper[5118]: I0223 08:58:05.480904 5118 scope.go:117] "RemoveContainer" containerID="f88048c0d6a3e1ce4197293513987f53aaec3b8f66b4d42a3481a6306bb060da" Feb 23 08:58:08 crc kubenswrapper[5118]: I0223 08:58:08.856188 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:08 crc kubenswrapper[5118]: I0223 08:58:08.858982 5118 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:08 crc kubenswrapper[5118]: I0223 08:58:08.861064 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 23 08:58:08 crc kubenswrapper[5118]: I0223 08:58:08.879408 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042020 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042077 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxqlb\" (UniqueName: \"kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042203 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042224 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " 
pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042257 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.042295 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144242 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqlb\" (UniqueName: \"kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144381 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144405 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " 
pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144433 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144468 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.144534 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.145498 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.145651 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.145733 
5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.146264 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.147737 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.186211 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqlb\" (UniqueName: \"kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb\") pod \"dnsmasq-dns-8758d4797-r5ktc\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:09 crc kubenswrapper[5118]: I0223 08:58:09.481070 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:10 crc kubenswrapper[5118]: I0223 08:58:10.024610 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:10 crc kubenswrapper[5118]: I0223 08:58:10.728684 5118 generic.go:334] "Generic (PLEG): container finished" podID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerID="ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec" exitCode=0 Feb 23 08:58:10 crc kubenswrapper[5118]: I0223 08:58:10.728781 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" event={"ID":"3dc62041-10c6-49aa-8e63-0709f7a4318b","Type":"ContainerDied","Data":"ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec"} Feb 23 08:58:10 crc kubenswrapper[5118]: I0223 08:58:10.729294 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" event={"ID":"3dc62041-10c6-49aa-8e63-0709f7a4318b","Type":"ContainerStarted","Data":"9a3946eeca7a28710566fbc1bc792df93e9f6df0ead2a2ebf370b488922d8c4a"} Feb 23 08:58:11 crc kubenswrapper[5118]: I0223 08:58:11.745670 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" event={"ID":"3dc62041-10c6-49aa-8e63-0709f7a4318b","Type":"ContainerStarted","Data":"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335"} Feb 23 08:58:11 crc kubenswrapper[5118]: I0223 08:58:11.746419 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:11 crc kubenswrapper[5118]: I0223 08:58:11.769178 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" podStartSLOduration=3.7691535529999998 podStartE2EDuration="3.769153553s" podCreationTimestamp="2026-02-23 08:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:11.76821567 +0000 UTC m=+7954.772000243" watchObservedRunningTime="2026-02-23 08:58:11.769153553 +0000 UTC m=+7954.772938136" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.483316 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.575754 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.576267 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="dnsmasq-dns" containerID="cri-o://69d2ee957e48bdee7c2ffc1e9effb3c71c81c1f1ac38d8c40c330606bb072c98" gracePeriod=10 Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.768707 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bb5756f9-zlz5k"] Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.772375 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.775853 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.786568 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bb5756f9-zlz5k"] Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816043 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv4cz\" (UniqueName: \"kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816129 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816181 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816203 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " 
pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816249 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816282 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.816309 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.856215 5118 generic.go:334] "Generic (PLEG): container finished" podID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerID="69d2ee957e48bdee7c2ffc1e9effb3c71c81c1f1ac38d8c40c330606bb072c98" exitCode=0 Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.856480 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" event={"ID":"fb4d376f-4315-4f50-925f-9a88f42d12c2","Type":"ContainerDied","Data":"69d2ee957e48bdee7c2ffc1e9effb3c71c81c1f1ac38d8c40c330606bb072c98"} Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919506 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rv4cz\" (UniqueName: \"kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919617 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919668 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919691 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919741 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919773 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.919800 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.921288 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.921800 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.922012 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.922600 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: 
\"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.922705 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.924724 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.939118 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bb5756f9-zlz5k"] Feb 23 08:58:19 crc kubenswrapper[5118]: E0223 08:58:19.940071 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rv4cz], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" podUID="b65126f0-29da-4233-bfde-c59711df1e33" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.963446 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv4cz\" (UniqueName: \"kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz\") pod \"dnsmasq-dns-86bb5756f9-zlz5k\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.976755 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-665d6cc987-mjtg9"] Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.978609 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:19 crc kubenswrapper[5118]: I0223 08:58:19.992355 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665d6cc987-mjtg9"] Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.022644 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-networker\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.023034 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-sb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.023184 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-dns-svc\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.023309 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-nb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.023458 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-config\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.024045 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qscm\" (UniqueName: \"kubernetes.io/projected/fcbae92c-436e-4555-bf63-c25cf526532a-kube-api-access-7qscm\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.024228 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-cell1\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.126973 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-cell1\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127168 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-sb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127197 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-networker\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127231 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-dns-svc\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127274 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-nb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127323 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-config\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127348 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qscm\" (UniqueName: \"kubernetes.io/projected/fcbae92c-436e-4555-bf63-c25cf526532a-kube-api-access-7qscm\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.127919 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-cell1\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.128542 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-nb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.128612 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-openstack-networker\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.129233 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-config\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.130653 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-dns-svc\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.131802 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fcbae92c-436e-4555-bf63-c25cf526532a-ovsdbserver-sb\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.149021 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qscm\" (UniqueName: \"kubernetes.io/projected/fcbae92c-436e-4555-bf63-c25cf526532a-kube-api-access-7qscm\") pod \"dnsmasq-dns-665d6cc987-mjtg9\" (UID: \"fcbae92c-436e-4555-bf63-c25cf526532a\") " pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.247207 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.331370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2bt5\" (UniqueName: \"kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5\") pod \"fb4d376f-4315-4f50-925f-9a88f42d12c2\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.331472 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb\") pod \"fb4d376f-4315-4f50-925f-9a88f42d12c2\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.331684 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config\") pod \"fb4d376f-4315-4f50-925f-9a88f42d12c2\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.331808 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc\") pod \"fb4d376f-4315-4f50-925f-9a88f42d12c2\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.331844 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb\") pod \"fb4d376f-4315-4f50-925f-9a88f42d12c2\" (UID: \"fb4d376f-4315-4f50-925f-9a88f42d12c2\") " Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.332018 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.337179 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5" (OuterVolumeSpecName: "kube-api-access-p2bt5") pod "fb4d376f-4315-4f50-925f-9a88f42d12c2" (UID: "fb4d376f-4315-4f50-925f-9a88f42d12c2"). InnerVolumeSpecName "kube-api-access-p2bt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.387679 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config" (OuterVolumeSpecName: "config") pod "fb4d376f-4315-4f50-925f-9a88f42d12c2" (UID: "fb4d376f-4315-4f50-925f-9a88f42d12c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.392608 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb4d376f-4315-4f50-925f-9a88f42d12c2" (UID: "fb4d376f-4315-4f50-925f-9a88f42d12c2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.403629 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb4d376f-4315-4f50-925f-9a88f42d12c2" (UID: "fb4d376f-4315-4f50-925f-9a88f42d12c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.415760 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb4d376f-4315-4f50-925f-9a88f42d12c2" (UID: "fb4d376f-4315-4f50-925f-9a88f42d12c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.435187 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2bt5\" (UniqueName: \"kubernetes.io/projected/fb4d376f-4315-4f50-925f-9a88f42d12c2-kube-api-access-p2bt5\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.435231 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.435246 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.435258 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 
23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.435272 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb4d376f-4315-4f50-925f-9a88f42d12c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.869077 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" event={"ID":"fb4d376f-4315-4f50-925f-9a88f42d12c2","Type":"ContainerDied","Data":"4ec0f882eef60f97f02eaa29969c29beafbd33b19a01ee1c351194377dbb9ae6"} Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.869437 5118 scope.go:117] "RemoveContainer" containerID="69d2ee957e48bdee7c2ffc1e9effb3c71c81c1f1ac38d8c40c330606bb072c98" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.869122 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.869219 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd8bf8f99-s7tfz" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.884333 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.913057 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-665d6cc987-mjtg9"] Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.918319 5118 scope.go:117] "RemoveContainer" containerID="486b85afb16fd0af16ddd940bf606202f136994781b28465549d41bdc3558c9d" Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.923083 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:58:20 crc kubenswrapper[5118]: I0223 08:58:20.934152 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd8bf8f99-s7tfz"] Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074308 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074696 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv4cz\" (UniqueName: \"kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074723 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074795 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074838 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074904 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config\") pod \"b65126f0-29da-4233-bfde-c59711df1e33\" (UID: \"b65126f0-29da-4233-bfde-c59711df1e33\") " Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.074988 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.075352 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.075422 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.075892 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.076224 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config" (OuterVolumeSpecName: "config") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.076323 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.076448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.078693 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz" (OuterVolumeSpecName: "kube-api-access-rv4cz") pod "b65126f0-29da-4233-bfde-c59711df1e33" (UID: "b65126f0-29da-4233-bfde-c59711df1e33"). InnerVolumeSpecName "kube-api-access-rv4cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177698 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177744 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177757 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177770 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv4cz\" (UniqueName: 
\"kubernetes.io/projected/b65126f0-29da-4233-bfde-c59711df1e33-kube-api-access-rv4cz\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177786 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.177797 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65126f0-29da-4233-bfde-c59711df1e33-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.709287 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" path="/var/lib/kubelet/pods/fb4d376f-4315-4f50-925f-9a88f42d12c2/volumes" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.878965 5118 generic.go:334] "Generic (PLEG): container finished" podID="fcbae92c-436e-4555-bf63-c25cf526532a" containerID="ca80c9dd4f1c1aa1a3e3a8747b3e4664cfb481d53f79cbba5582aed2660396c1" exitCode=0 Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.879032 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" event={"ID":"fcbae92c-436e-4555-bf63-c25cf526532a","Type":"ContainerDied","Data":"ca80c9dd4f1c1aa1a3e3a8747b3e4664cfb481d53f79cbba5582aed2660396c1"} Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.880035 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" event={"ID":"fcbae92c-436e-4555-bf63-c25cf526532a","Type":"ContainerStarted","Data":"ff04b33b2e254cebe594bc2d70415a032867db0739d6cb4ab766a730f1a1c848"} Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.881856 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bb5756f9-zlz5k" Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.966485 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bb5756f9-zlz5k"] Feb 23 08:58:21 crc kubenswrapper[5118]: I0223 08:58:21.976187 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bb5756f9-zlz5k"] Feb 23 08:58:22 crc kubenswrapper[5118]: I0223 08:58:22.922041 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" event={"ID":"fcbae92c-436e-4555-bf63-c25cf526532a","Type":"ContainerStarted","Data":"5347e4e70964785ad61e719a84f76c09d723c6db7556475467142c9fd8a10d77"} Feb 23 08:58:22 crc kubenswrapper[5118]: I0223 08:58:22.922568 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:22 crc kubenswrapper[5118]: I0223 08:58:22.954449 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" podStartSLOduration=3.954428499 podStartE2EDuration="3.954428499s" podCreationTimestamp="2026-02-23 08:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 08:58:22.947592805 +0000 UTC m=+7965.951377378" watchObservedRunningTime="2026-02-23 08:58:22.954428499 +0000 UTC m=+7965.958213072" Feb 23 08:58:23 crc kubenswrapper[5118]: I0223 08:58:23.711167 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65126f0-29da-4233-bfde-c59711df1e33" path="/var/lib/kubelet/pods/b65126f0-29da-4233-bfde-c59711df1e33/volumes" Feb 23 08:58:28 crc kubenswrapper[5118]: I0223 08:58:28.065060 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sgvkz"] Feb 23 08:58:28 crc kubenswrapper[5118]: I0223 08:58:28.080079 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-7106-account-create-update-8qzp6"] Feb 23 08:58:28 crc kubenswrapper[5118]: I0223 08:58:28.091288 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sgvkz"] Feb 23 08:58:28 crc kubenswrapper[5118]: I0223 08:58:28.101217 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7106-account-create-update-8qzp6"] Feb 23 08:58:29 crc kubenswrapper[5118]: I0223 08:58:29.715994 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f624a50-4192-4e8b-9759-9610d25bd843" path="/var/lib/kubelet/pods/2f624a50-4192-4e8b-9759-9610d25bd843/volumes" Feb 23 08:58:29 crc kubenswrapper[5118]: I0223 08:58:29.719554 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe01c183-1593-499e-a295-d927371fae2f" path="/var/lib/kubelet/pods/fe01c183-1593-499e-a295-d927371fae2f/volumes" Feb 23 08:58:30 crc kubenswrapper[5118]: I0223 08:58:30.334252 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-665d6cc987-mjtg9" Feb 23 08:58:30 crc kubenswrapper[5118]: I0223 08:58:30.418193 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:30 crc kubenswrapper[5118]: I0223 08:58:30.418470 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="dnsmasq-dns" containerID="cri-o://8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335" gracePeriod=10 Feb 23 08:58:30 crc kubenswrapper[5118]: I0223 08:58:30.956255 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.027354 5118 generic.go:334] "Generic (PLEG): container finished" podID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerID="8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335" exitCode=0 Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.027400 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" event={"ID":"3dc62041-10c6-49aa-8e63-0709f7a4318b","Type":"ContainerDied","Data":"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335"} Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.027431 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" event={"ID":"3dc62041-10c6-49aa-8e63-0709f7a4318b","Type":"ContainerDied","Data":"9a3946eeca7a28710566fbc1bc792df93e9f6df0ead2a2ebf370b488922d8c4a"} Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.027448 5118 scope.go:117] "RemoveContainer" containerID="8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.027674 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8758d4797-r5ktc" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.087491 5118 scope.go:117] "RemoveContainer" containerID="ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.103850 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.103906 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.103960 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxqlb\" (UniqueName: \"kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.104083 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.104181 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: 
\"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.104219 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb\") pod \"3dc62041-10c6-49aa-8e63-0709f7a4318b\" (UID: \"3dc62041-10c6-49aa-8e63-0709f7a4318b\") " Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.122498 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb" (OuterVolumeSpecName: "kube-api-access-fxqlb") pod "3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "kube-api-access-fxqlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.209059 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxqlb\" (UniqueName: \"kubernetes.io/projected/3dc62041-10c6-49aa-8e63-0709f7a4318b-kube-api-access-fxqlb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.209981 5118 scope.go:117] "RemoveContainer" containerID="8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335" Feb 23 08:58:31 crc kubenswrapper[5118]: E0223 08:58:31.237024 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335\": container with ID starting with 8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335 not found: ID does not exist" containerID="8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.237071 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335"} err="failed to get container status \"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335\": rpc error: code = NotFound desc = could not find container \"8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335\": container with ID starting with 8581207f97cec6e03e20b012312485d1e0e1842d342bcd4f2e083e9c3b850335 not found: ID does not exist" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.237118 5118 scope.go:117] "RemoveContainer" containerID="ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec" Feb 23 08:58:31 crc kubenswrapper[5118]: E0223 08:58:31.243279 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec\": container with ID starting with ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec not found: ID does not exist" containerID="ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.243330 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec"} err="failed to get container status \"ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec\": rpc error: code = NotFound desc = could not find container \"ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec\": container with ID starting with ec9e8a1439e4f057ac953e80acf203478b140ad2c922ba72ce900cb75d3af4ec not found: ID does not exist" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.243757 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.251946 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.278988 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.282648 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.288594 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config" (OuterVolumeSpecName: "config") pod "3dc62041-10c6-49aa-8e63-0709f7a4318b" (UID: "3dc62041-10c6-49aa-8e63-0709f7a4318b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.311276 5118 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-config\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.311322 5118 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.311335 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.311352 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.311360 5118 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc62041-10c6-49aa-8e63-0709f7a4318b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.360333 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.368778 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8758d4797-r5ktc"] Feb 23 08:58:31 crc kubenswrapper[5118]: I0223 08:58:31.710112 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" path="/var/lib/kubelet/pods/3dc62041-10c6-49aa-8e63-0709f7a4318b/volumes" Feb 23 08:58:32 crc kubenswrapper[5118]: I0223 
08:58:32.975577 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:58:32 crc kubenswrapper[5118]: I0223 08:58:32.975910 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.159682 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c"] Feb 23 08:58:41 crc kubenswrapper[5118]: E0223 08:58:41.161076 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="init" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161113 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="init" Feb 23 08:58:41 crc kubenswrapper[5118]: E0223 08:58:41.161125 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161134 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: E0223 08:58:41.161182 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="init" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161192 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="init" Feb 23 08:58:41 crc kubenswrapper[5118]: E0223 08:58:41.161213 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161219 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161445 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc62041-10c6-49aa-8e63-0709f7a4318b" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.161457 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb4d376f-4315-4f50-925f-9a88f42d12c2" containerName="dnsmasq-dns" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.162380 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.167308 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.168802 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.170282 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.170372 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.170542 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p"] Feb 23 08:58:41 crc 
kubenswrapper[5118]: I0223 08:58:41.172576 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.176531 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.177731 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.190953 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c"] Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.220018 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p"] Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.221803 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj2j7\" (UniqueName: \"kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.221875 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 
crc kubenswrapper[5118]: I0223 08:58:41.221941 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.221981 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.222011 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323290 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323359 5118 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323410 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323476 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323506 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323531 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj2j7\" (UniqueName: \"kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.323719 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzsjf\" (UniqueName: \"kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.324035 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.329925 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.330506 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.330624 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.336925 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.339807 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj2j7\" (UniqueName: \"kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-chv96c\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.425913 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzsjf\" (UniqueName: \"kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.425975 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.426010 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.426071 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: 
\"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.442607 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.449659 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzsjf\" (UniqueName: \"kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.459292 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.459937 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.505456 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:58:41 crc kubenswrapper[5118]: I0223 08:58:41.520400 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:58:42 crc kubenswrapper[5118]: I0223 08:58:42.132527 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c"] Feb 23 08:58:42 crc kubenswrapper[5118]: I0223 08:58:42.153285 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" event={"ID":"a5415a2b-4e23-462e-a219-3ec3c0c3e369","Type":"ContainerStarted","Data":"df0839b074ee78552bc4331a528fb8cdcebd735228e484812116a5eddc83876a"} Feb 23 08:58:42 crc kubenswrapper[5118]: I0223 08:58:42.295115 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p"] Feb 23 08:58:42 crc kubenswrapper[5118]: W0223 08:58:42.302911 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod090ec22d_8571_49a4_b33a_fc211a094200.slice/crio-7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69 WatchSource:0}: Error finding container 7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69: Status 404 returned error can't find the container with id 7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69 Feb 23 08:58:43 crc kubenswrapper[5118]: I0223 08:58:43.165333 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" event={"ID":"090ec22d-8571-49a4-b33a-fc211a094200","Type":"ContainerStarted","Data":"7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69"} Feb 23 08:58:53 crc kubenswrapper[5118]: I0223 08:58:53.290202 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" event={"ID":"090ec22d-8571-49a4-b33a-fc211a094200","Type":"ContainerStarted","Data":"d8667d8f4908d77d20cb3afa38751bc2362e320de1bd6d2fc712299c8495556d"} Feb 23 08:58:53 crc kubenswrapper[5118]: I0223 08:58:53.292436 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" event={"ID":"a5415a2b-4e23-462e-a219-3ec3c0c3e369","Type":"ContainerStarted","Data":"84d84b63cce140366b1c9f6e4101722047b82eb8aa845c6e48f3bcf5a5a513f1"} Feb 23 08:58:53 crc kubenswrapper[5118]: I0223 08:58:53.315927 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" podStartSLOduration=2.420975348 podStartE2EDuration="12.315902768s" podCreationTimestamp="2026-02-23 08:58:41 +0000 UTC" firstStartedPulling="2026-02-23 08:58:42.305497822 +0000 UTC m=+7985.309282395" lastFinishedPulling="2026-02-23 08:58:52.200425232 +0000 UTC m=+7995.204209815" observedRunningTime="2026-02-23 08:58:53.307393453 +0000 UTC m=+7996.311178036" watchObservedRunningTime="2026-02-23 08:58:53.315902768 +0000 UTC m=+7996.319687341" Feb 23 08:58:53 crc kubenswrapper[5118]: I0223 08:58:53.335929 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" podStartSLOduration=2.270514843 podStartE2EDuration="12.335904149s" podCreationTimestamp="2026-02-23 08:58:41 +0000 UTC" firstStartedPulling="2026-02-23 08:58:42.141190935 +0000 UTC 
m=+7985.144975518" lastFinishedPulling="2026-02-23 08:58:52.206580241 +0000 UTC m=+7995.210364824" observedRunningTime="2026-02-23 08:58:53.327687571 +0000 UTC m=+7996.331472154" watchObservedRunningTime="2026-02-23 08:58:53.335904149 +0000 UTC m=+7996.339688722" Feb 23 08:59:02 crc kubenswrapper[5118]: I0223 08:59:02.975453 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 08:59:02 crc kubenswrapper[5118]: I0223 08:59:02.976223 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 08:59:02 crc kubenswrapper[5118]: I0223 08:59:02.976295 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 08:59:02 crc kubenswrapper[5118]: I0223 08:59:02.977483 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 08:59:02 crc kubenswrapper[5118]: I0223 08:59:02.977598 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" 
containerID="cri-o://a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae" gracePeriod=600 Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.399813 5118 generic.go:334] "Generic (PLEG): container finished" podID="a5415a2b-4e23-462e-a219-3ec3c0c3e369" containerID="84d84b63cce140366b1c9f6e4101722047b82eb8aa845c6e48f3bcf5a5a513f1" exitCode=0 Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.399923 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" event={"ID":"a5415a2b-4e23-462e-a219-3ec3c0c3e369","Type":"ContainerDied","Data":"84d84b63cce140366b1c9f6e4101722047b82eb8aa845c6e48f3bcf5a5a513f1"} Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.403045 5118 generic.go:334] "Generic (PLEG): container finished" podID="090ec22d-8571-49a4-b33a-fc211a094200" containerID="d8667d8f4908d77d20cb3afa38751bc2362e320de1bd6d2fc712299c8495556d" exitCode=0 Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.403130 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" event={"ID":"090ec22d-8571-49a4-b33a-fc211a094200","Type":"ContainerDied","Data":"d8667d8f4908d77d20cb3afa38751bc2362e320de1bd6d2fc712299c8495556d"} Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.406341 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae" exitCode=0 Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.406378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae"} Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.406402 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025"} Feb 23 08:59:03 crc kubenswrapper[5118]: I0223 08:59:03.406435 5118 scope.go:117] "RemoveContainer" containerID="c515ae7a0d5dd36f623827e5b71883704781de4f37413b348cb3bfd5b3a5fbcf" Feb 23 08:59:04 crc kubenswrapper[5118]: I0223 08:59:04.904993 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.060806 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory\") pod \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.060867 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1\") pod \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.060895 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph\") pod \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.061057 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj2j7\" (UniqueName: \"kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7\") pod 
\"a5415a2b-4e23-462e-a219-3ec3c0c3e369\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.061165 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle\") pod \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\" (UID: \"a5415a2b-4e23-462e-a219-3ec3c0c3e369\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.067644 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7" (OuterVolumeSpecName: "kube-api-access-dj2j7") pod "a5415a2b-4e23-462e-a219-3ec3c0c3e369" (UID: "a5415a2b-4e23-462e-a219-3ec3c0c3e369"). InnerVolumeSpecName "kube-api-access-dj2j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.067802 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph" (OuterVolumeSpecName: "ceph") pod "a5415a2b-4e23-462e-a219-3ec3c0c3e369" (UID: "a5415a2b-4e23-462e-a219-3ec3c0c3e369"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.067690 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "a5415a2b-4e23-462e-a219-3ec3c0c3e369" (UID: "a5415a2b-4e23-462e-a219-3ec3c0c3e369"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.093993 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a5415a2b-4e23-462e-a219-3ec3c0c3e369" (UID: "a5415a2b-4e23-462e-a219-3ec3c0c3e369"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.108746 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory" (OuterVolumeSpecName: "inventory") pod "a5415a2b-4e23-462e-a219-3ec3c0c3e369" (UID: "a5415a2b-4e23-462e-a219-3ec3c0c3e369"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.163820 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.163860 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.163874 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.163904 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj2j7\" (UniqueName: \"kubernetes.io/projected/a5415a2b-4e23-462e-a219-3ec3c0c3e369-kube-api-access-dj2j7\") on node \"crc\" DevicePath 
\"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.163920 5118 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5415a2b-4e23-462e-a219-3ec3c0c3e369-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.432468 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" event={"ID":"a5415a2b-4e23-462e-a219-3ec3c0c3e369","Type":"ContainerDied","Data":"df0839b074ee78552bc4331a528fb8cdcebd735228e484812116a5eddc83876a"} Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.432518 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0839b074ee78552bc4331a528fb8cdcebd735228e484812116a5eddc83876a" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.432557 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-chv96c" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.539055 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.659210 5118 scope.go:117] "RemoveContainer" containerID="6954477e9bfb929d48ee7e253172c3d2bfbdcbff4194cc62381fbd21f0391c8a" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.672961 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker\") pod \"090ec22d-8571-49a4-b33a-fc211a094200\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.673925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory\") pod \"090ec22d-8571-49a4-b33a-fc211a094200\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.673954 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzsjf\" (UniqueName: \"kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf\") pod \"090ec22d-8571-49a4-b33a-fc211a094200\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.674125 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle\") pod \"090ec22d-8571-49a4-b33a-fc211a094200\" (UID: \"090ec22d-8571-49a4-b33a-fc211a094200\") " Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.677948 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "090ec22d-8571-49a4-b33a-fc211a094200" (UID: "090ec22d-8571-49a4-b33a-fc211a094200"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.678619 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf" (OuterVolumeSpecName: "kube-api-access-qzsjf") pod "090ec22d-8571-49a4-b33a-fc211a094200" (UID: "090ec22d-8571-49a4-b33a-fc211a094200"). InnerVolumeSpecName "kube-api-access-qzsjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.699924 5118 scope.go:117] "RemoveContainer" containerID="8551dd328058db9971618b44ca26868db695791ba6cab633179fb403d34348a7" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.700762 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory" (OuterVolumeSpecName: "inventory") pod "090ec22d-8571-49a4-b33a-fc211a094200" (UID: "090ec22d-8571-49a4-b33a-fc211a094200"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.710163 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "090ec22d-8571-49a4-b33a-fc211a094200" (UID: "090ec22d-8571-49a4-b33a-fc211a094200"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.777719 5118 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.777960 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.778009 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/090ec22d-8571-49a4-b33a-fc211a094200-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:05 crc kubenswrapper[5118]: I0223 08:59:05.778028 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzsjf\" (UniqueName: \"kubernetes.io/projected/090ec22d-8571-49a4-b33a-fc211a094200-kube-api-access-qzsjf\") on node \"crc\" DevicePath \"\"" Feb 23 08:59:06 crc kubenswrapper[5118]: I0223 08:59:06.447741 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" event={"ID":"090ec22d-8571-49a4-b33a-fc211a094200","Type":"ContainerDied","Data":"7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69"} Feb 23 08:59:06 crc kubenswrapper[5118]: I0223 08:59:06.447794 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a683aa741a42911726f2d66fabf0bcf7bd3406f7baa27f8a8a329f9d6882f69" Feb 23 08:59:06 crc kubenswrapper[5118]: I0223 08:59:06.447858 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p" Feb 23 08:59:08 crc kubenswrapper[5118]: I0223 08:59:08.046989 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fszjc"] Feb 23 08:59:08 crc kubenswrapper[5118]: I0223 08:59:08.057562 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fszjc"] Feb 23 08:59:09 crc kubenswrapper[5118]: I0223 08:59:09.714782 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f526e0b-c3b6-42d7-ba48-0d476701b579" path="/var/lib/kubelet/pods/5f526e0b-c3b6-42d7-ba48-0d476701b579/volumes" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.935012 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp"] Feb 23 08:59:13 crc kubenswrapper[5118]: E0223 08:59:13.936166 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090ec22d-8571-49a4-b33a-fc211a094200" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.936185 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="090ec22d-8571-49a4-b33a-fc211a094200" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 23 08:59:13 crc kubenswrapper[5118]: E0223 08:59:13.936229 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5415a2b-4e23-462e-a219-3ec3c0c3e369" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.936238 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5415a2b-4e23-462e-a219-3ec3c0c3e369" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.936449 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="090ec22d-8571-49a4-b33a-fc211a094200" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.936466 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5415a2b-4e23-462e-a219-3ec3c0c3e369" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.937423 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.940495 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.941533 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.944006 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.951419 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.954416 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf"] Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.957821 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.964834 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.972396 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 08:59:13 crc kubenswrapper[5118]: I0223 08:59:13.986061 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp"] Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.002220 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf"] Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.076382 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfvd\" (UniqueName: \"kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.076745 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077069 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077230 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsndx\" (UniqueName: \"kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077287 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077328 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077524 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077853 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.077993 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.180918 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181045 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 
23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181271 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsndx\" (UniqueName: \"kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181380 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181460 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181544 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181591 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.181735 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfvd\" (UniqueName: \"kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.195175 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.195251 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.195382 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.196066 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.205995 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.206038 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.206272 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nsndx\" (UniqueName: \"kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.208453 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.210297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfvd\" (UniqueName: \"kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.275260 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.290384 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 08:59:14 crc kubenswrapper[5118]: I0223 08:59:14.976471 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp"] Feb 23 08:59:14 crc kubenswrapper[5118]: W0223 08:59:14.976981 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37085206_dc9b_47c1_a2a5_5b54b83c7e47.slice/crio-a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c WatchSource:0}: Error finding container a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c: Status 404 returned error can't find the container with id a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c Feb 23 08:59:15 crc kubenswrapper[5118]: I0223 08:59:15.059220 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf"] Feb 23 08:59:15 crc kubenswrapper[5118]: W0223 08:59:15.066570 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a03095b_b10c_4ab3_b1cb_4b31730a0585.slice/crio-6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8 WatchSource:0}: Error finding container 6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8: Status 404 returned error can't find the container with id 6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8 Feb 23 08:59:15 crc kubenswrapper[5118]: I0223 08:59:15.563120 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" event={"ID":"37085206-dc9b-47c1-a2a5-5b54b83c7e47","Type":"ContainerStarted","Data":"a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c"} Feb 23 08:59:15 crc kubenswrapper[5118]: I0223 08:59:15.565946 5118 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" event={"ID":"2a03095b-b10c-4ab3-b1cb-4b31730a0585","Type":"ContainerStarted","Data":"6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8"} Feb 23 08:59:16 crc kubenswrapper[5118]: I0223 08:59:16.577266 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" event={"ID":"37085206-dc9b-47c1-a2a5-5b54b83c7e47","Type":"ContainerStarted","Data":"f10b41b574b4b3c9b78baa1377a7cb00cf4ee66a2dba8b5a7436f18fb2701754"} Feb 23 08:59:16 crc kubenswrapper[5118]: I0223 08:59:16.580198 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" event={"ID":"2a03095b-b10c-4ab3-b1cb-4b31730a0585","Type":"ContainerStarted","Data":"fe45d9ca06a9b62541e453a10876ad1813e789788e71266e7626098b0fc8f0c8"} Feb 23 08:59:16 crc kubenswrapper[5118]: I0223 08:59:16.605821 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" podStartSLOduration=3.027106926 podStartE2EDuration="3.605793583s" podCreationTimestamp="2026-02-23 08:59:13 +0000 UTC" firstStartedPulling="2026-02-23 08:59:14.98179904 +0000 UTC m=+8017.985583623" lastFinishedPulling="2026-02-23 08:59:15.560485707 +0000 UTC m=+8018.564270280" observedRunningTime="2026-02-23 08:59:16.599066071 +0000 UTC m=+8019.602850664" watchObservedRunningTime="2026-02-23 08:59:16.605793583 +0000 UTC m=+8019.609578196" Feb 23 08:59:16 crc kubenswrapper[5118]: I0223 08:59:16.633650 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" podStartSLOduration=3.178738737 podStartE2EDuration="3.633625433s" podCreationTimestamp="2026-02-23 08:59:13 +0000 UTC" firstStartedPulling="2026-02-23 08:59:15.069202345 +0000 UTC m=+8018.072986928" 
lastFinishedPulling="2026-02-23 08:59:15.524089011 +0000 UTC m=+8018.527873624" observedRunningTime="2026-02-23 08:59:16.620525337 +0000 UTC m=+8019.624309920" watchObservedRunningTime="2026-02-23 08:59:16.633625433 +0000 UTC m=+8019.637410006" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.182231 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk"] Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.185015 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.190717 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.191084 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.211791 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk"] Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.322619 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.322687 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume\") pod \"collect-profiles-29530620-x2cvk\" 
(UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.322726 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngf4\" (UniqueName: \"kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.426258 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.426336 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.426371 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fngf4\" (UniqueName: \"kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.430001 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.435566 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.453896 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngf4\" (UniqueName: \"kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4\") pod \"collect-profiles-29530620-x2cvk\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:00 crc kubenswrapper[5118]: I0223 09:00:00.513364 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:01 crc kubenswrapper[5118]: I0223 09:00:01.014203 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk"] Feb 23 09:00:01 crc kubenswrapper[5118]: I0223 09:00:01.081563 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" event={"ID":"4ea9878a-b108-4feb-9385-158a04f591d6","Type":"ContainerStarted","Data":"130932d2232f317b40fb1e6b1bc5ee2eeb5325377241b550015688c957f00da3"} Feb 23 09:00:02 crc kubenswrapper[5118]: I0223 09:00:02.103820 5118 generic.go:334] "Generic (PLEG): container finished" podID="4ea9878a-b108-4feb-9385-158a04f591d6" containerID="e38de7cf3e4d61cc96c53c0f4331d839db9b961f1850ec8823ede3c4e92d2b8b" exitCode=0 Feb 23 09:00:02 crc kubenswrapper[5118]: I0223 09:00:02.103957 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" event={"ID":"4ea9878a-b108-4feb-9385-158a04f591d6","Type":"ContainerDied","Data":"e38de7cf3e4d61cc96c53c0f4331d839db9b961f1850ec8823ede3c4e92d2b8b"} Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.543900 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.701652 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fngf4\" (UniqueName: \"kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4\") pod \"4ea9878a-b108-4feb-9385-158a04f591d6\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.701965 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume\") pod \"4ea9878a-b108-4feb-9385-158a04f591d6\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.702320 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume\") pod \"4ea9878a-b108-4feb-9385-158a04f591d6\" (UID: \"4ea9878a-b108-4feb-9385-158a04f591d6\") " Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.704503 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ea9878a-b108-4feb-9385-158a04f591d6" (UID: "4ea9878a-b108-4feb-9385-158a04f591d6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.706629 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea9878a-b108-4feb-9385-158a04f591d6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.722888 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ea9878a-b108-4feb-9385-158a04f591d6" (UID: "4ea9878a-b108-4feb-9385-158a04f591d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.729626 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4" (OuterVolumeSpecName: "kube-api-access-fngf4") pod "4ea9878a-b108-4feb-9385-158a04f591d6" (UID: "4ea9878a-b108-4feb-9385-158a04f591d6"). InnerVolumeSpecName "kube-api-access-fngf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.808513 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea9878a-b108-4feb-9385-158a04f591d6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:03 crc kubenswrapper[5118]: I0223 09:00:03.808908 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fngf4\" (UniqueName: \"kubernetes.io/projected/4ea9878a-b108-4feb-9385-158a04f591d6-kube-api-access-fngf4\") on node \"crc\" DevicePath \"\"" Feb 23 09:00:04 crc kubenswrapper[5118]: I0223 09:00:04.146238 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" event={"ID":"4ea9878a-b108-4feb-9385-158a04f591d6","Type":"ContainerDied","Data":"130932d2232f317b40fb1e6b1bc5ee2eeb5325377241b550015688c957f00da3"} Feb 23 09:00:04 crc kubenswrapper[5118]: I0223 09:00:04.146281 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="130932d2232f317b40fb1e6b1bc5ee2eeb5325377241b550015688c957f00da3" Feb 23 09:00:04 crc kubenswrapper[5118]: I0223 09:00:04.146471 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk" Feb 23 09:00:04 crc kubenswrapper[5118]: I0223 09:00:04.666230 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g"] Feb 23 09:00:04 crc kubenswrapper[5118]: I0223 09:00:04.677780 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530575-6tm2g"] Feb 23 09:00:05 crc kubenswrapper[5118]: I0223 09:00:05.712847 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79b299b-f10b-4056-aa4e-78a9a3368d64" path="/var/lib/kubelet/pods/d79b299b-f10b-4056-aa4e-78a9a3368d64/volumes" Feb 23 09:00:05 crc kubenswrapper[5118]: I0223 09:00:05.813058 5118 scope.go:117] "RemoveContainer" containerID="613538892e59998b65529532169dcd1a057ce670163a6fe806a3db8934f299fe" Feb 23 09:00:05 crc kubenswrapper[5118]: I0223 09:00:05.847068 5118 scope.go:117] "RemoveContainer" containerID="4109af6013c738107be704ca631d39f3a74fbe065217b23d61a2be23c6929ffc" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.181256 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530621-v4ftv"] Feb 23 09:01:00 crc kubenswrapper[5118]: E0223 09:01:00.182452 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea9878a-b108-4feb-9385-158a04f591d6" containerName="collect-profiles" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.182476 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea9878a-b108-4feb-9385-158a04f591d6" containerName="collect-profiles" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.182907 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea9878a-b108-4feb-9385-158a04f591d6" containerName="collect-profiles" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.186918 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.217711 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-v4ftv"] Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.377748 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.377806 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.377841 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.377874 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j5sq\" (UniqueName: \"kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.482139 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.482219 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.482270 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.482308 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j5sq\" (UniqueName: \"kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.488799 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.490884 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.507385 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.521906 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j5sq\" (UniqueName: \"kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq\") pod \"keystone-cron-29530621-v4ftv\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:00 crc kubenswrapper[5118]: I0223 09:01:00.522630 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:01 crc kubenswrapper[5118]: I0223 09:01:01.002320 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530621-v4ftv"] Feb 23 09:01:01 crc kubenswrapper[5118]: I0223 09:01:01.831343 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-v4ftv" event={"ID":"63ba3449-8278-4d38-b02c-080d45721d1c","Type":"ContainerStarted","Data":"f0c9307106555e570cbfc6b9fae5f65bd0d2f9c05dbd5dcabeccca1bd428a535"} Feb 23 09:01:01 crc kubenswrapper[5118]: I0223 09:01:01.831686 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-v4ftv" event={"ID":"63ba3449-8278-4d38-b02c-080d45721d1c","Type":"ContainerStarted","Data":"a0367a2764c95e0dcc91f591aca4f6f432cbc3d1f29f0cf04da8eb864cf8e183"} Feb 23 09:01:01 crc kubenswrapper[5118]: I0223 09:01:01.854450 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530621-v4ftv" podStartSLOduration=1.8544345679999998 podStartE2EDuration="1.854434568s" podCreationTimestamp="2026-02-23 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:01:01.848038354 +0000 UTC m=+8124.851822927" watchObservedRunningTime="2026-02-23 09:01:01.854434568 +0000 UTC m=+8124.858219141" Feb 23 09:01:03 crc kubenswrapper[5118]: I0223 09:01:03.850922 5118 generic.go:334] "Generic (PLEG): container finished" podID="63ba3449-8278-4d38-b02c-080d45721d1c" containerID="f0c9307106555e570cbfc6b9fae5f65bd0d2f9c05dbd5dcabeccca1bd428a535" exitCode=0 Feb 23 09:01:03 crc kubenswrapper[5118]: I0223 09:01:03.851001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-v4ftv" 
event={"ID":"63ba3449-8278-4d38-b02c-080d45721d1c","Type":"ContainerDied","Data":"f0c9307106555e570cbfc6b9fae5f65bd0d2f9c05dbd5dcabeccca1bd428a535"} Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.319632 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.427536 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data\") pod \"63ba3449-8278-4d38-b02c-080d45721d1c\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.427785 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle\") pod \"63ba3449-8278-4d38-b02c-080d45721d1c\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.428065 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys\") pod \"63ba3449-8278-4d38-b02c-080d45721d1c\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.428384 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j5sq\" (UniqueName: \"kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq\") pod \"63ba3449-8278-4d38-b02c-080d45721d1c\" (UID: \"63ba3449-8278-4d38-b02c-080d45721d1c\") " Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.437373 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "63ba3449-8278-4d38-b02c-080d45721d1c" (UID: "63ba3449-8278-4d38-b02c-080d45721d1c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.437729 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq" (OuterVolumeSpecName: "kube-api-access-4j5sq") pod "63ba3449-8278-4d38-b02c-080d45721d1c" (UID: "63ba3449-8278-4d38-b02c-080d45721d1c"). InnerVolumeSpecName "kube-api-access-4j5sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.458403 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63ba3449-8278-4d38-b02c-080d45721d1c" (UID: "63ba3449-8278-4d38-b02c-080d45721d1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.503231 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data" (OuterVolumeSpecName: "config-data") pod "63ba3449-8278-4d38-b02c-080d45721d1c" (UID: "63ba3449-8278-4d38-b02c-080d45721d1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.532787 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.532850 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j5sq\" (UniqueName: \"kubernetes.io/projected/63ba3449-8278-4d38-b02c-080d45721d1c-kube-api-access-4j5sq\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.532875 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.532895 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63ba3449-8278-4d38-b02c-080d45721d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.876305 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530621-v4ftv" event={"ID":"63ba3449-8278-4d38-b02c-080d45721d1c","Type":"ContainerDied","Data":"a0367a2764c95e0dcc91f591aca4f6f432cbc3d1f29f0cf04da8eb864cf8e183"} Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.876691 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0367a2764c95e0dcc91f591aca4f6f432cbc3d1f29f0cf04da8eb864cf8e183" Feb 23 09:01:05 crc kubenswrapper[5118]: I0223 09:01:05.876387 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530621-v4ftv" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.583586 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:15 crc kubenswrapper[5118]: E0223 09:01:15.584669 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ba3449-8278-4d38-b02c-080d45721d1c" containerName="keystone-cron" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.584689 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ba3449-8278-4d38-b02c-080d45721d1c" containerName="keystone-cron" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.584994 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ba3449-8278-4d38-b02c-080d45721d1c" containerName="keystone-cron" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.586934 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.605152 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.648621 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgp6m\" (UniqueName: \"kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.648757 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " 
pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.648805 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.750821 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgp6m\" (UniqueName: \"kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.750919 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.750948 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.751460 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " 
pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.751583 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.769174 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgp6m\" (UniqueName: \"kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m\") pod \"redhat-marketplace-fg6jj\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:15 crc kubenswrapper[5118]: I0223 09:01:15.920130 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:16 crc kubenswrapper[5118]: I0223 09:01:16.404678 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:16 crc kubenswrapper[5118]: I0223 09:01:16.988244 5118 generic.go:334] "Generic (PLEG): container finished" podID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerID="6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43" exitCode=0 Feb 23 09:01:16 crc kubenswrapper[5118]: I0223 09:01:16.988509 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerDied","Data":"6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43"} Feb 23 09:01:16 crc kubenswrapper[5118]: I0223 09:01:16.988560 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" 
event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerStarted","Data":"3c267ea4b734b8c7c891c5b6be41cc0be338d29d4d34e4898d4a6ae5c6ed310d"} Feb 23 09:01:16 crc kubenswrapper[5118]: I0223 09:01:16.990726 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:01:19 crc kubenswrapper[5118]: I0223 09:01:19.009034 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerStarted","Data":"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31"} Feb 23 09:01:20 crc kubenswrapper[5118]: I0223 09:01:20.019432 5118 generic.go:334] "Generic (PLEG): container finished" podID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerID="2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31" exitCode=0 Feb 23 09:01:20 crc kubenswrapper[5118]: I0223 09:01:20.019469 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerDied","Data":"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31"} Feb 23 09:01:21 crc kubenswrapper[5118]: I0223 09:01:21.044112 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerStarted","Data":"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4"} Feb 23 09:01:21 crc kubenswrapper[5118]: I0223 09:01:21.069412 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fg6jj" podStartSLOduration=2.6324015579999998 podStartE2EDuration="6.069395258s" podCreationTimestamp="2026-02-23 09:01:15 +0000 UTC" firstStartedPulling="2026-02-23 09:01:16.990370253 +0000 UTC m=+8139.994154846" lastFinishedPulling="2026-02-23 09:01:20.427363973 +0000 UTC 
m=+8143.431148546" observedRunningTime="2026-02-23 09:01:21.067334169 +0000 UTC m=+8144.071118742" watchObservedRunningTime="2026-02-23 09:01:21.069395258 +0000 UTC m=+8144.073179831" Feb 23 09:01:25 crc kubenswrapper[5118]: I0223 09:01:25.920560 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:25 crc kubenswrapper[5118]: I0223 09:01:25.922469 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:25 crc kubenswrapper[5118]: I0223 09:01:25.987198 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:26 crc kubenswrapper[5118]: I0223 09:01:26.173310 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:26 crc kubenswrapper[5118]: I0223 09:01:26.267560 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.145870 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fg6jj" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="registry-server" containerID="cri-o://fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4" gracePeriod=2 Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.771082 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.838608 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgp6m\" (UniqueName: \"kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m\") pod \"ce0333cf-702e-4607-a1a1-acbddcbac09c\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.838743 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content\") pod \"ce0333cf-702e-4607-a1a1-acbddcbac09c\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.838799 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities\") pod \"ce0333cf-702e-4607-a1a1-acbddcbac09c\" (UID: \"ce0333cf-702e-4607-a1a1-acbddcbac09c\") " Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.839855 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities" (OuterVolumeSpecName: "utilities") pod "ce0333cf-702e-4607-a1a1-acbddcbac09c" (UID: "ce0333cf-702e-4607-a1a1-acbddcbac09c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.845761 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m" (OuterVolumeSpecName: "kube-api-access-xgp6m") pod "ce0333cf-702e-4607-a1a1-acbddcbac09c" (UID: "ce0333cf-702e-4607-a1a1-acbddcbac09c"). InnerVolumeSpecName "kube-api-access-xgp6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.858276 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce0333cf-702e-4607-a1a1-acbddcbac09c" (UID: "ce0333cf-702e-4607-a1a1-acbddcbac09c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.942088 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgp6m\" (UniqueName: \"kubernetes.io/projected/ce0333cf-702e-4607-a1a1-acbddcbac09c-kube-api-access-xgp6m\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.942133 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:28 crc kubenswrapper[5118]: I0223 09:01:28.942142 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce0333cf-702e-4607-a1a1-acbddcbac09c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.157528 5118 generic.go:334] "Generic (PLEG): container finished" podID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerID="fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4" exitCode=0 Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.157578 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerDied","Data":"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4"} Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.157618 5118 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fg6jj" event={"ID":"ce0333cf-702e-4607-a1a1-acbddcbac09c","Type":"ContainerDied","Data":"3c267ea4b734b8c7c891c5b6be41cc0be338d29d4d34e4898d4a6ae5c6ed310d"} Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.157636 5118 scope.go:117] "RemoveContainer" containerID="fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.157644 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fg6jj" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.186009 5118 scope.go:117] "RemoveContainer" containerID="2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.236325 5118 scope.go:117] "RemoveContainer" containerID="6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.236968 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.250157 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fg6jj"] Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.262902 5118 scope.go:117] "RemoveContainer" containerID="fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4" Feb 23 09:01:29 crc kubenswrapper[5118]: E0223 09:01:29.263340 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4\": container with ID starting with fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4 not found: ID does not exist" containerID="fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.263374 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4"} err="failed to get container status \"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4\": rpc error: code = NotFound desc = could not find container \"fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4\": container with ID starting with fc3c998cce3751c358813fa5063c80884c8a3f0c9434c89f2342793935ff93f4 not found: ID does not exist" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.263419 5118 scope.go:117] "RemoveContainer" containerID="2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31" Feb 23 09:01:29 crc kubenswrapper[5118]: E0223 09:01:29.263616 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31\": container with ID starting with 2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31 not found: ID does not exist" containerID="2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.263640 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31"} err="failed to get container status \"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31\": rpc error: code = NotFound desc = could not find container \"2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31\": container with ID starting with 2f7769481de26e9242db1bedcd769f41e6051d01247a519a9e822e8a7756ed31 not found: ID does not exist" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.263656 5118 scope.go:117] "RemoveContainer" containerID="6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43" Feb 23 09:01:29 crc kubenswrapper[5118]: E0223 
09:01:29.263846 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43\": container with ID starting with 6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43 not found: ID does not exist" containerID="6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.263870 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43"} err="failed to get container status \"6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43\": rpc error: code = NotFound desc = could not find container \"6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43\": container with ID starting with 6e68a5ab99463abd461a83dd45c242b1eb70898d5f03198a18d658eec64bcf43 not found: ID does not exist" Feb 23 09:01:29 crc kubenswrapper[5118]: I0223 09:01:29.715648 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" path="/var/lib/kubelet/pods/ce0333cf-702e-4607-a1a1-acbddcbac09c/volumes" Feb 23 09:01:32 crc kubenswrapper[5118]: I0223 09:01:32.974933 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:01:32 crc kubenswrapper[5118]: I0223 09:01:32.975597 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 23 09:02:02 crc kubenswrapper[5118]: I0223 09:02:02.975290 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:02:02 crc kubenswrapper[5118]: I0223 09:02:02.976206 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:02:32 crc kubenswrapper[5118]: I0223 09:02:32.975326 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:02:32 crc kubenswrapper[5118]: I0223 09:02:32.975912 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:02:32 crc kubenswrapper[5118]: I0223 09:02:32.975963 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:02:32 crc kubenswrapper[5118]: I0223 09:02:32.976634 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025"} 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:02:32 crc kubenswrapper[5118]: I0223 09:02:32.976705 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" gracePeriod=600 Feb 23 09:02:33 crc kubenswrapper[5118]: E0223 09:02:33.102757 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:02:33 crc kubenswrapper[5118]: I0223 09:02:33.919392 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" exitCode=0 Feb 23 09:02:33 crc kubenswrapper[5118]: I0223 09:02:33.919524 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025"} Feb 23 09:02:33 crc kubenswrapper[5118]: I0223 09:02:33.919719 5118 scope.go:117] "RemoveContainer" containerID="a926957ef80dc4661611201421a5011ea66ab9aea8a66b566fdd4ea493c9ceae" Feb 23 09:02:33 crc kubenswrapper[5118]: I0223 09:02:33.921163 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 
23 09:02:33 crc kubenswrapper[5118]: E0223 09:02:33.921782 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:02:44 crc kubenswrapper[5118]: I0223 09:02:44.697495 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:02:44 crc kubenswrapper[5118]: E0223 09:02:44.698155 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:02:55 crc kubenswrapper[5118]: I0223 09:02:55.697981 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:02:55 crc kubenswrapper[5118]: E0223 09:02:55.700445 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:03:09 crc kubenswrapper[5118]: I0223 09:03:09.698432 5118 scope.go:117] "RemoveContainer" 
containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:03:09 crc kubenswrapper[5118]: E0223 09:03:09.699322 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:03:24 crc kubenswrapper[5118]: I0223 09:03:24.697503 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:03:24 crc kubenswrapper[5118]: E0223 09:03:24.698125 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:03:36 crc kubenswrapper[5118]: I0223 09:03:36.699844 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:03:36 crc kubenswrapper[5118]: E0223 09:03:36.701771 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.046474 5118 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-7af3-account-create-update-pflph"] Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.057429 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-7af3-account-create-update-pflph"] Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.068280 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-d9crc"] Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.077222 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-d9crc"] Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.713250 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fed72ec-b872-42ca-91e9-354debb471bd" path="/var/lib/kubelet/pods/5fed72ec-b872-42ca-91e9-354debb471bd/volumes" Feb 23 09:03:43 crc kubenswrapper[5118]: I0223 09:03:43.714279 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5" path="/var/lib/kubelet/pods/b7f99e13-f91b-4ee8-a7f1-4e0b7399b0c5/volumes" Feb 23 09:03:45 crc kubenswrapper[5118]: I0223 09:03:45.679867 5118 generic.go:334] "Generic (PLEG): container finished" podID="2a03095b-b10c-4ab3-b1cb-4b31730a0585" containerID="fe45d9ca06a9b62541e453a10876ad1813e789788e71266e7626098b0fc8f0c8" exitCode=0 Feb 23 09:03:45 crc kubenswrapper[5118]: I0223 09:03:45.679964 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" event={"ID":"2a03095b-b10c-4ab3-b1cb-4b31730a0585","Type":"ContainerDied","Data":"fe45d9ca06a9b62541e453a10876ad1813e789788e71266e7626098b0fc8f0c8"} Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.162239 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.275799 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle\") pod \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.275855 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker\") pod \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.276055 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgfvd\" (UniqueName: \"kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd\") pod \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.276080 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory\") pod \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\" (UID: \"2a03095b-b10c-4ab3-b1cb-4b31730a0585\") " Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.282676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "2a03095b-b10c-4ab3-b1cb-4b31730a0585" (UID: "2a03095b-b10c-4ab3-b1cb-4b31730a0585"). 
InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.283686 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd" (OuterVolumeSpecName: "kube-api-access-pgfvd") pod "2a03095b-b10c-4ab3-b1cb-4b31730a0585" (UID: "2a03095b-b10c-4ab3-b1cb-4b31730a0585"). InnerVolumeSpecName "kube-api-access-pgfvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.305508 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory" (OuterVolumeSpecName: "inventory") pod "2a03095b-b10c-4ab3-b1cb-4b31730a0585" (UID: "2a03095b-b10c-4ab3-b1cb-4b31730a0585"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.325332 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "2a03095b-b10c-4ab3-b1cb-4b31730a0585" (UID: "2a03095b-b10c-4ab3-b1cb-4b31730a0585"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.378980 5118 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.379057 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.379078 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgfvd\" (UniqueName: \"kubernetes.io/projected/2a03095b-b10c-4ab3-b1cb-4b31730a0585-kube-api-access-pgfvd\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.379113 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a03095b-b10c-4ab3-b1cb-4b31730a0585-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.704173 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.716059 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf" event={"ID":"2a03095b-b10c-4ab3-b1cb-4b31730a0585","Type":"ContainerDied","Data":"6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8"} Feb 23 09:03:47 crc kubenswrapper[5118]: I0223 09:03:47.716166 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6652c15f749b3797d71c0821a913e9a00ceae51df10a7dd7babea809194364c8" Feb 23 09:03:49 crc kubenswrapper[5118]: I0223 09:03:49.702513 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:03:49 crc kubenswrapper[5118]: E0223 09:03:49.704043 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:03:59 crc kubenswrapper[5118]: I0223 09:03:59.038152 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-tjk8l"] Feb 23 09:03:59 crc kubenswrapper[5118]: I0223 09:03:59.048595 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-tjk8l"] Feb 23 09:03:59 crc kubenswrapper[5118]: I0223 09:03:59.710439 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d70fed4-6e64-4129-8fc5-75a44f448492" path="/var/lib/kubelet/pods/6d70fed4-6e64-4129-8fc5-75a44f448492/volumes" Feb 23 09:04:03 crc kubenswrapper[5118]: I0223 09:04:03.697963 5118 scope.go:117] "RemoveContainer" 
containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:04:03 crc kubenswrapper[5118]: E0223 09:04:03.701330 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:04:06 crc kubenswrapper[5118]: I0223 09:04:06.096032 5118 scope.go:117] "RemoveContainer" containerID="b8e1f2a06064dfc1973d351a0bebc58729628220ebb1f3d731c3aaaf5b585d8f" Feb 23 09:04:06 crc kubenswrapper[5118]: I0223 09:04:06.151266 5118 scope.go:117] "RemoveContainer" containerID="de1cf432c845b9fed9f90796455105de65a739de606adeee1601a42dd09ed884" Feb 23 09:04:06 crc kubenswrapper[5118]: I0223 09:04:06.211039 5118 scope.go:117] "RemoveContainer" containerID="390206dd53bd2a04438ba11eb9dd3db5f160cedeefc5f1853c0a8812278a576b" Feb 23 09:04:15 crc kubenswrapper[5118]: I0223 09:04:15.698491 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:04:15 crc kubenswrapper[5118]: E0223 09:04:15.699334 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:04:29 crc kubenswrapper[5118]: I0223 09:04:29.698474 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:04:29 crc 
kubenswrapper[5118]: E0223 09:04:29.699888 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:04:41 crc kubenswrapper[5118]: I0223 09:04:41.697603 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:04:41 crc kubenswrapper[5118]: E0223 09:04:41.698663 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:04:52 crc kubenswrapper[5118]: I0223 09:04:52.698723 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:04:52 crc kubenswrapper[5118]: E0223 09:04:52.700135 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:05:04 crc kubenswrapper[5118]: I0223 09:05:04.698619 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 
23 09:05:04 crc kubenswrapper[5118]: E0223 09:05:04.699436 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:05:18 crc kubenswrapper[5118]: I0223 09:05:18.698422 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:05:18 crc kubenswrapper[5118]: E0223 09:05:18.699434 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:05:30 crc kubenswrapper[5118]: I0223 09:05:30.697909 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:05:30 crc kubenswrapper[5118]: E0223 09:05:30.698598 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:05:41 crc kubenswrapper[5118]: I0223 09:05:41.697932 5118 scope.go:117] "RemoveContainer" 
containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:05:41 crc kubenswrapper[5118]: E0223 09:05:41.698684 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:05:56 crc kubenswrapper[5118]: I0223 09:05:56.698090 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:05:56 crc kubenswrapper[5118]: E0223 09:05:56.698825 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:06:10 crc kubenswrapper[5118]: I0223 09:06:10.698089 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:06:10 crc kubenswrapper[5118]: E0223 09:06:10.699575 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.049876 5118 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-1fc5-account-create-update-jpknx"] Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.061585 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-rd62q"] Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.071358 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-rd62q"] Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.080052 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-1fc5-account-create-update-jpknx"] Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.712116 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a17dcef-17b9-4b97-8d8e-5793a42993fa" path="/var/lib/kubelet/pods/2a17dcef-17b9-4b97-8d8e-5793a42993fa/volumes" Feb 23 09:06:15 crc kubenswrapper[5118]: I0223 09:06:15.712922 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724cca70-923b-4ffd-9acd-8091e7e33632" path="/var/lib/kubelet/pods/724cca70-923b-4ffd-9acd-8091e7e33632/volumes" Feb 23 09:06:24 crc kubenswrapper[5118]: I0223 09:06:24.697349 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:06:24 crc kubenswrapper[5118]: E0223 09:06:24.700077 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:06:28 crc kubenswrapper[5118]: I0223 09:06:28.039673 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-pmzh9"] Feb 23 09:06:28 crc kubenswrapper[5118]: I0223 09:06:28.057203 5118 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-pmzh9"] Feb 23 09:06:29 crc kubenswrapper[5118]: I0223 09:06:29.709977 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651f79aa-be9a-40aa-afae-ebb8d8497a7d" path="/var/lib/kubelet/pods/651f79aa-be9a-40aa-afae-ebb8d8497a7d/volumes" Feb 23 09:06:37 crc kubenswrapper[5118]: I0223 09:06:37.704669 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:06:37 crc kubenswrapper[5118]: E0223 09:06:37.706266 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:06:46 crc kubenswrapper[5118]: I0223 09:06:46.048619 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-ctz6l"] Feb 23 09:06:46 crc kubenswrapper[5118]: I0223 09:06:46.057615 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-ctz6l"] Feb 23 09:06:47 crc kubenswrapper[5118]: I0223 09:06:47.046445 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-4dde-account-create-update-h7lbk"] Feb 23 09:06:47 crc kubenswrapper[5118]: I0223 09:06:47.073716 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-4dde-account-create-update-h7lbk"] Feb 23 09:06:47 crc kubenswrapper[5118]: I0223 09:06:47.710255 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ba0846-0019-4e22-83da-01867d8f5605" path="/var/lib/kubelet/pods/06ba0846-0019-4e22-83da-01867d8f5605/volumes" Feb 23 09:06:47 crc kubenswrapper[5118]: I0223 09:06:47.711024 5118 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d758df02-271f-469b-b9c4-ef4b56b0f745" path="/var/lib/kubelet/pods/d758df02-271f-469b-b9c4-ef4b56b0f745/volumes" Feb 23 09:06:51 crc kubenswrapper[5118]: I0223 09:06:51.697252 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:06:51 crc kubenswrapper[5118]: E0223 09:06:51.698022 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:07:01 crc kubenswrapper[5118]: I0223 09:07:01.039088 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-rv82j"] Feb 23 09:07:01 crc kubenswrapper[5118]: I0223 09:07:01.048594 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-rv82j"] Feb 23 09:07:01 crc kubenswrapper[5118]: I0223 09:07:01.707900 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c07546f-84d3-41cf-a43a-f6cbe7129aca" path="/var/lib/kubelet/pods/1c07546f-84d3-41cf-a43a-f6cbe7129aca/volumes" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.395557 5118 scope.go:117] "RemoveContainer" containerID="568ca2fff95840aa64307a9d805e10c65cd0b3d649ce697e31dbb928c009b322" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.440619 5118 scope.go:117] "RemoveContainer" containerID="11b9f61934120e6a994398b7c68652af10d3cac440288682549d3130cefee46c" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.476740 5118 scope.go:117] "RemoveContainer" containerID="03d8b73093ced66b25c2400e2eb2d65570e3f23cec2116c02c6f866b0efb2804" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 
09:07:06.540061 5118 scope.go:117] "RemoveContainer" containerID="e85fee9128b7a1f5e96cf1fef037d6a1c38a7bfa72954553ede8111475ae186e" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.602937 5118 scope.go:117] "RemoveContainer" containerID="1085916e24138bbcf53f8ec5956bce19754b84760c910821a629bf32816f65f9" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.626555 5118 scope.go:117] "RemoveContainer" containerID="77f6c7533e8f9c84f7533b4d524125c76f2b22db261909594b9fb3ba4155fd34" Feb 23 09:07:06 crc kubenswrapper[5118]: I0223 09:07:06.697811 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:07:06 crc kubenswrapper[5118]: E0223 09:07:06.698211 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.933617 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:11 crc kubenswrapper[5118]: E0223 09:07:11.934640 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="registry-server" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.934654 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="registry-server" Feb 23 09:07:11 crc kubenswrapper[5118]: E0223 09:07:11.934679 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a03095b-b10c-4ab3-b1cb-4b31730a0585" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 23 09:07:11 crc 
kubenswrapper[5118]: I0223 09:07:11.934691 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a03095b-b10c-4ab3-b1cb-4b31730a0585" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 23 09:07:11 crc kubenswrapper[5118]: E0223 09:07:11.934719 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="extract-utilities" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.934726 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="extract-utilities" Feb 23 09:07:11 crc kubenswrapper[5118]: E0223 09:07:11.934738 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="extract-content" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.934745 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="extract-content" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.935027 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a03095b-b10c-4ab3-b1cb-4b31730a0585" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.935041 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0333cf-702e-4607-a1a1-acbddcbac09c" containerName="registry-server" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.936679 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:11 crc kubenswrapper[5118]: I0223 09:07:11.942733 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.063035 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.063595 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smzn9\" (UniqueName: \"kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.063682 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.165274 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.165358 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-smzn9\" (UniqueName: \"kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.165448 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.165745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.165824 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.185319 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smzn9\" (UniqueName: \"kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9\") pod \"community-operators-tmqzt\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.272506 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.796572 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:12 crc kubenswrapper[5118]: I0223 09:07:12.891569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerStarted","Data":"238a7ea71d8cb715b8f52f3d34bd3ee59ccf2728ff81da36390726afe232d791"} Feb 23 09:07:13 crc kubenswrapper[5118]: I0223 09:07:13.901353 5118 generic.go:334] "Generic (PLEG): container finished" podID="ad031e2a-81b8-4068-9549-592efb859894" containerID="ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a" exitCode=0 Feb 23 09:07:13 crc kubenswrapper[5118]: I0223 09:07:13.901446 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerDied","Data":"ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a"} Feb 23 09:07:13 crc kubenswrapper[5118]: I0223 09:07:13.903470 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:07:14 crc kubenswrapper[5118]: I0223 09:07:14.916701 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerStarted","Data":"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a"} Feb 23 09:07:16 crc kubenswrapper[5118]: I0223 09:07:16.935549 5118 generic.go:334] "Generic (PLEG): container finished" podID="ad031e2a-81b8-4068-9549-592efb859894" containerID="5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a" exitCode=0 Feb 23 09:07:16 crc kubenswrapper[5118]: I0223 09:07:16.935681 5118 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerDied","Data":"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a"} Feb 23 09:07:17 crc kubenswrapper[5118]: I0223 09:07:17.959032 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerStarted","Data":"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d"} Feb 23 09:07:17 crc kubenswrapper[5118]: I0223 09:07:17.984516 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmqzt" podStartSLOduration=3.528836902 podStartE2EDuration="6.984493211s" podCreationTimestamp="2026-02-23 09:07:11 +0000 UTC" firstStartedPulling="2026-02-23 09:07:13.90318512 +0000 UTC m=+8496.906969703" lastFinishedPulling="2026-02-23 09:07:17.358841439 +0000 UTC m=+8500.362626012" observedRunningTime="2026-02-23 09:07:17.974973681 +0000 UTC m=+8500.978758254" watchObservedRunningTime="2026-02-23 09:07:17.984493211 +0000 UTC m=+8500.988277784" Feb 23 09:07:21 crc kubenswrapper[5118]: I0223 09:07:21.698004 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:07:21 crc kubenswrapper[5118]: E0223 09:07:21.698753 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:07:22 crc kubenswrapper[5118]: I0223 09:07:22.273378 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:22 crc kubenswrapper[5118]: I0223 09:07:22.273696 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:22 crc kubenswrapper[5118]: I0223 09:07:22.347723 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:23 crc kubenswrapper[5118]: I0223 09:07:23.068966 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:23 crc kubenswrapper[5118]: I0223 09:07:23.145240 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.026147 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmqzt" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="registry-server" containerID="cri-o://2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d" gracePeriod=2 Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.576786 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.652964 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smzn9\" (UniqueName: \"kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9\") pod \"ad031e2a-81b8-4068-9549-592efb859894\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.653161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities\") pod \"ad031e2a-81b8-4068-9549-592efb859894\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.653206 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content\") pod \"ad031e2a-81b8-4068-9549-592efb859894\" (UID: \"ad031e2a-81b8-4068-9549-592efb859894\") " Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.654913 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities" (OuterVolumeSpecName: "utilities") pod "ad031e2a-81b8-4068-9549-592efb859894" (UID: "ad031e2a-81b8-4068-9549-592efb859894"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.659224 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9" (OuterVolumeSpecName: "kube-api-access-smzn9") pod "ad031e2a-81b8-4068-9549-592efb859894" (UID: "ad031e2a-81b8-4068-9549-592efb859894"). InnerVolumeSpecName "kube-api-access-smzn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.713113 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad031e2a-81b8-4068-9549-592efb859894" (UID: "ad031e2a-81b8-4068-9549-592efb859894"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.756065 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.756301 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad031e2a-81b8-4068-9549-592efb859894-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:25 crc kubenswrapper[5118]: I0223 09:07:25.756396 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smzn9\" (UniqueName: \"kubernetes.io/projected/ad031e2a-81b8-4068-9549-592efb859894-kube-api-access-smzn9\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.036343 5118 generic.go:334] "Generic (PLEG): container finished" podID="ad031e2a-81b8-4068-9549-592efb859894" containerID="2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d" exitCode=0 Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.036388 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerDied","Data":"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d"} Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.036412 5118 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmqzt" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.036427 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmqzt" event={"ID":"ad031e2a-81b8-4068-9549-592efb859894","Type":"ContainerDied","Data":"238a7ea71d8cb715b8f52f3d34bd3ee59ccf2728ff81da36390726afe232d791"} Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.036445 5118 scope.go:117] "RemoveContainer" containerID="2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.060513 5118 scope.go:117] "RemoveContainer" containerID="5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.072669 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.085276 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmqzt"] Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.089412 5118 scope.go:117] "RemoveContainer" containerID="ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.132867 5118 scope.go:117] "RemoveContainer" containerID="2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d" Feb 23 09:07:26 crc kubenswrapper[5118]: E0223 09:07:26.133547 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d\": container with ID starting with 2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d not found: ID does not exist" containerID="2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.133587 
5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d"} err="failed to get container status \"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d\": rpc error: code = NotFound desc = could not find container \"2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d\": container with ID starting with 2ba907bc050ec81a7f33ef87bf940f2cb27cff2e6a80cb37f82b4de35c4b142d not found: ID does not exist" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.133613 5118 scope.go:117] "RemoveContainer" containerID="5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a" Feb 23 09:07:26 crc kubenswrapper[5118]: E0223 09:07:26.134003 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a\": container with ID starting with 5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a not found: ID does not exist" containerID="5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.134023 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a"} err="failed to get container status \"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a\": rpc error: code = NotFound desc = could not find container \"5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a\": container with ID starting with 5d07b7b684bc546d3b787d6c506903f3e997cd7525f3af920c7685c179a3b89a not found: ID does not exist" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.134037 5118 scope.go:117] "RemoveContainer" containerID="ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a" Feb 23 09:07:26 crc kubenswrapper[5118]: E0223 
09:07:26.134315 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a\": container with ID starting with ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a not found: ID does not exist" containerID="ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a" Feb 23 09:07:26 crc kubenswrapper[5118]: I0223 09:07:26.134335 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a"} err="failed to get container status \"ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a\": rpc error: code = NotFound desc = could not find container \"ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a\": container with ID starting with ac79a8f19dfeed95db7f2f5fa55626ed15f22e30bc88c39b2916b5eac608ca4a not found: ID does not exist" Feb 23 09:07:27 crc kubenswrapper[5118]: I0223 09:07:27.709110 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad031e2a-81b8-4068-9549-592efb859894" path="/var/lib/kubelet/pods/ad031e2a-81b8-4068-9549-592efb859894/volumes" Feb 23 09:07:33 crc kubenswrapper[5118]: I0223 09:07:33.697908 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:07:34 crc kubenswrapper[5118]: I0223 09:07:34.114027 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720"} Feb 23 09:07:46 crc kubenswrapper[5118]: E0223 09:07:46.473622 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37085206_dc9b_47c1_a2a5_5b54b83c7e47.slice/crio-conmon-f10b41b574b4b3c9b78baa1377a7cb00cf4ee66a2dba8b5a7436f18fb2701754.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37085206_dc9b_47c1_a2a5_5b54b83c7e47.slice/crio-f10b41b574b4b3c9b78baa1377a7cb00cf4ee66a2dba8b5a7436f18fb2701754.scope\": RecentStats: unable to find data in memory cache]" Feb 23 09:07:47 crc kubenswrapper[5118]: I0223 09:07:47.228666 5118 generic.go:334] "Generic (PLEG): container finished" podID="37085206-dc9b-47c1-a2a5-5b54b83c7e47" containerID="f10b41b574b4b3c9b78baa1377a7cb00cf4ee66a2dba8b5a7436f18fb2701754" exitCode=0 Feb 23 09:07:47 crc kubenswrapper[5118]: I0223 09:07:47.228730 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" event={"ID":"37085206-dc9b-47c1-a2a5-5b54b83c7e47","Type":"ContainerDied","Data":"f10b41b574b4b3c9b78baa1377a7cb00cf4ee66a2dba8b5a7436f18fb2701754"} Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.710384 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.826936 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1\") pod \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.827320 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsndx\" (UniqueName: \"kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx\") pod \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.827473 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph\") pod \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.827632 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory\") pod \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.827716 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle\") pod \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\" (UID: \"37085206-dc9b-47c1-a2a5-5b54b83c7e47\") " Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.832658 5118 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph" (OuterVolumeSpecName: "ceph") pod "37085206-dc9b-47c1-a2a5-5b54b83c7e47" (UID: "37085206-dc9b-47c1-a2a5-5b54b83c7e47"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.833071 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx" (OuterVolumeSpecName: "kube-api-access-nsndx") pod "37085206-dc9b-47c1-a2a5-5b54b83c7e47" (UID: "37085206-dc9b-47c1-a2a5-5b54b83c7e47"). InnerVolumeSpecName "kube-api-access-nsndx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.836240 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "37085206-dc9b-47c1-a2a5-5b54b83c7e47" (UID: "37085206-dc9b-47c1-a2a5-5b54b83c7e47"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.855117 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "37085206-dc9b-47c1-a2a5-5b54b83c7e47" (UID: "37085206-dc9b-47c1-a2a5-5b54b83c7e47"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.865309 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory" (OuterVolumeSpecName: "inventory") pod "37085206-dc9b-47c1-a2a5-5b54b83c7e47" (UID: "37085206-dc9b-47c1-a2a5-5b54b83c7e47"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.930128 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.930163 5118 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.930178 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.930192 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsndx\" (UniqueName: \"kubernetes.io/projected/37085206-dc9b-47c1-a2a5-5b54b83c7e47-kube-api-access-nsndx\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:48 crc kubenswrapper[5118]: I0223 09:07:48.930204 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/37085206-dc9b-47c1-a2a5-5b54b83c7e47-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:07:49 crc kubenswrapper[5118]: I0223 09:07:49.253009 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" event={"ID":"37085206-dc9b-47c1-a2a5-5b54b83c7e47","Type":"ContainerDied","Data":"a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c"} Feb 23 09:07:49 crc kubenswrapper[5118]: I0223 09:07:49.253431 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a573baf76635e2e0a4f9bd1598b72389ebd22f86979ee73844bd29cd1e23e01c" Feb 23 09:07:49 crc kubenswrapper[5118]: I0223 09:07:49.253079 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.096525 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-tq4rm"] Feb 23 09:07:59 crc kubenswrapper[5118]: E0223 09:07:59.097627 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="extract-utilities" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.097644 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="extract-utilities" Feb 23 09:07:59 crc kubenswrapper[5118]: E0223 09:07:59.097653 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="extract-content" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.097661 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="extract-content" Feb 23 09:07:59 crc kubenswrapper[5118]: E0223 09:07:59.097685 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="registry-server" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.097695 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad031e2a-81b8-4068-9549-592efb859894" 
containerName="registry-server" Feb 23 09:07:59 crc kubenswrapper[5118]: E0223 09:07:59.097706 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37085206-dc9b-47c1-a2a5-5b54b83c7e47" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.097715 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="37085206-dc9b-47c1-a2a5-5b54b83c7e47" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.097972 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad031e2a-81b8-4068-9549-592efb859894" containerName="registry-server" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.098009 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="37085206-dc9b-47c1-a2a5-5b54b83c7e47" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.099148 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.101246 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.103049 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.103325 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.103473 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.134232 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-tq4rm"] Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.172943 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-krbms"] Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.174276 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.176999 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.177014 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.184089 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.184246 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.184274 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnmsp\" (UniqueName: \"kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.184319 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.184363 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.185720 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-krbms"] Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286753 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286810 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286834 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2ql\" (UniqueName: 
\"kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286919 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmsp\" (UniqueName: \"kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286935 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286964 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.286980 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.287008 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.294175 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.294733 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.294780 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: 
\"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.297533 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.309823 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmsp\" (UniqueName: \"kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp\") pod \"bootstrap-openstack-openstack-cell1-tq4rm\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.394174 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.394524 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.394679 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.394724 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2ql\" (UniqueName: \"kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.399359 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.399852 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.402891 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc 
kubenswrapper[5118]: I0223 09:07:59.420000 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2ql\" (UniqueName: \"kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql\") pod \"bootstrap-openstack-openstack-networker-krbms\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.443210 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:07:59 crc kubenswrapper[5118]: I0223 09:07:59.500573 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:08:00 crc kubenswrapper[5118]: I0223 09:08:00.040769 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-tq4rm"] Feb 23 09:08:00 crc kubenswrapper[5118]: I0223 09:08:00.151773 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-krbms"] Feb 23 09:08:00 crc kubenswrapper[5118]: W0223 09:08:00.155372 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f95e63a_6e71_4bf2_9ba9_694e78a6e0c1.slice/crio-2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f WatchSource:0}: Error finding container 2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f: Status 404 returned error can't find the container with id 2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f Feb 23 09:08:00 crc kubenswrapper[5118]: I0223 09:08:00.342790 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-krbms" 
event={"ID":"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1","Type":"ContainerStarted","Data":"2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f"} Feb 23 09:08:00 crc kubenswrapper[5118]: I0223 09:08:00.345378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" event={"ID":"687c6ebc-4763-49bb-8293-b1e92f4f39ca","Type":"ContainerStarted","Data":"21858828c0261bff85a25e2f0a93b5c62fce30cc37c3179525580950df47ff5d"} Feb 23 09:08:01 crc kubenswrapper[5118]: I0223 09:08:01.355593 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" event={"ID":"687c6ebc-4763-49bb-8293-b1e92f4f39ca","Type":"ContainerStarted","Data":"e3f7d9c4998943af03b91c92d36e2459cc696562643d4b2561d6e11d44c0d217"} Feb 23 09:08:01 crc kubenswrapper[5118]: I0223 09:08:01.358815 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-krbms" event={"ID":"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1","Type":"ContainerStarted","Data":"9e8606ee682c3e992981c48b640d9f6ce36f945c2fdb1521422879a1f251d7cd"} Feb 23 09:08:01 crc kubenswrapper[5118]: I0223 09:08:01.378135 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" podStartSLOduration=1.9325473039999999 podStartE2EDuration="2.378115134s" podCreationTimestamp="2026-02-23 09:07:59 +0000 UTC" firstStartedPulling="2026-02-23 09:08:00.046452784 +0000 UTC m=+8543.050237357" lastFinishedPulling="2026-02-23 09:08:00.492020614 +0000 UTC m=+8543.495805187" observedRunningTime="2026-02-23 09:08:01.373668216 +0000 UTC m=+8544.377452789" watchObservedRunningTime="2026-02-23 09:08:01.378115134 +0000 UTC m=+8544.381899707" Feb 23 09:08:01 crc kubenswrapper[5118]: I0223 09:08:01.401565 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-krbms" 
podStartSLOduration=1.973405628 podStartE2EDuration="2.401077607s" podCreationTimestamp="2026-02-23 09:07:59 +0000 UTC" firstStartedPulling="2026-02-23 09:08:00.157638434 +0000 UTC m=+8543.161423007" lastFinishedPulling="2026-02-23 09:08:00.585310413 +0000 UTC m=+8543.589094986" observedRunningTime="2026-02-23 09:08:01.392405448 +0000 UTC m=+8544.396190021" watchObservedRunningTime="2026-02-23 09:08:01.401077607 +0000 UTC m=+8544.404862190" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.153743 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.165194 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.218970 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.304985 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tkj7\" (UniqueName: \"kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.305119 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.305147 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.407812 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tkj7\" (UniqueName: \"kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.407962 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.408009 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.408628 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.408683 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.432067 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tkj7\" (UniqueName: \"kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7\") pod \"certified-operators-r9zdn\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:58 crc kubenswrapper[5118]: I0223 09:09:58.519669 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:09:59 crc kubenswrapper[5118]: I0223 09:09:59.090622 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:09:59 crc kubenswrapper[5118]: I0223 09:09:59.508954 5118 generic.go:334] "Generic (PLEG): container finished" podID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerID="b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c" exitCode=0 Feb 23 09:09:59 crc kubenswrapper[5118]: I0223 09:09:59.509076 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerDied","Data":"b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c"} Feb 23 09:09:59 crc kubenswrapper[5118]: I0223 09:09:59.509311 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerStarted","Data":"5afdae68e5147ecac15d7b7b2dff7b45e534e3f4cd91c19bf260639782dbe0f2"} Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.529046 5118 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.531844 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.539123 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerStarted","Data":"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934"} Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.548904 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.570566 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.570674 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.570712 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhzt\" (UniqueName: \"kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc 
kubenswrapper[5118]: I0223 09:10:00.673036 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhzt\" (UniqueName: \"kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.673201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.673311 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.673785 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.673818 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.700130 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhzt\" (UniqueName: \"kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt\") pod \"redhat-operators-llqt5\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:00 crc kubenswrapper[5118]: I0223 09:10:00.855602 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:01 crc kubenswrapper[5118]: I0223 09:10:01.370641 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:01 crc kubenswrapper[5118]: I0223 09:10:01.550659 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerStarted","Data":"e2111cb97b32731972aa9042b6a4707f811a0774349ded355cb63e97d093b263"} Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.561809 5118 generic.go:334] "Generic (PLEG): container finished" podID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerID="9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934" exitCode=0 Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.561901 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerDied","Data":"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934"} Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.565609 5118 generic.go:334] "Generic (PLEG): container finished" podID="537b0b99-87da-4848-852a-84e6bdd5a361" containerID="36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6" exitCode=0 Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.565641 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerDied","Data":"36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6"} Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.976249 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:10:02 crc kubenswrapper[5118]: I0223 09:10:02.976306 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:10:03 crc kubenswrapper[5118]: I0223 09:10:03.575695 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerStarted","Data":"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68"} Feb 23 09:10:03 crc kubenswrapper[5118]: I0223 09:10:03.577326 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerStarted","Data":"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43"} Feb 23 09:10:03 crc kubenswrapper[5118]: I0223 09:10:03.628428 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r9zdn" podStartSLOduration=2.179926408 podStartE2EDuration="5.628406283s" podCreationTimestamp="2026-02-23 09:09:58 +0000 UTC" firstStartedPulling="2026-02-23 09:09:59.510628204 +0000 UTC m=+8662.514412817" 
lastFinishedPulling="2026-02-23 09:10:02.959108119 +0000 UTC m=+8665.962892692" observedRunningTime="2026-02-23 09:10:03.618898805 +0000 UTC m=+8666.622683378" watchObservedRunningTime="2026-02-23 09:10:03.628406283 +0000 UTC m=+8666.632190866" Feb 23 09:10:08 crc kubenswrapper[5118]: I0223 09:10:08.520012 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:08 crc kubenswrapper[5118]: I0223 09:10:08.523391 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:08 crc kubenswrapper[5118]: I0223 09:10:08.645001 5118 generic.go:334] "Generic (PLEG): container finished" podID="537b0b99-87da-4848-852a-84e6bdd5a361" containerID="b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68" exitCode=0 Feb 23 09:10:08 crc kubenswrapper[5118]: I0223 09:10:08.645125 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerDied","Data":"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68"} Feb 23 09:10:09 crc kubenswrapper[5118]: I0223 09:10:09.571462 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-r9zdn" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="registry-server" probeResult="failure" output=< Feb 23 09:10:09 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:10:09 crc kubenswrapper[5118]: > Feb 23 09:10:09 crc kubenswrapper[5118]: I0223 09:10:09.657351 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerStarted","Data":"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851"} Feb 23 09:10:09 crc kubenswrapper[5118]: I0223 
09:10:09.686483 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-llqt5" podStartSLOduration=3.2235275740000002 podStartE2EDuration="9.686457144s" podCreationTimestamp="2026-02-23 09:10:00 +0000 UTC" firstStartedPulling="2026-02-23 09:10:02.567436329 +0000 UTC m=+8665.571220902" lastFinishedPulling="2026-02-23 09:10:09.030365899 +0000 UTC m=+8672.034150472" observedRunningTime="2026-02-23 09:10:09.675454998 +0000 UTC m=+8672.679239571" watchObservedRunningTime="2026-02-23 09:10:09.686457144 +0000 UTC m=+8672.690241727" Feb 23 09:10:10 crc kubenswrapper[5118]: I0223 09:10:10.857416 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:10 crc kubenswrapper[5118]: I0223 09:10:10.857752 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:11 crc kubenswrapper[5118]: I0223 09:10:11.905138 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llqt5" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" probeResult="failure" output=< Feb 23 09:10:11 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:10:11 crc kubenswrapper[5118]: > Feb 23 09:10:18 crc kubenswrapper[5118]: I0223 09:10:18.579730 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:18 crc kubenswrapper[5118]: I0223 09:10:18.645493 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:18 crc kubenswrapper[5118]: I0223 09:10:18.827651 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:10:19 crc kubenswrapper[5118]: I0223 09:10:19.739356 
5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r9zdn" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="registry-server" containerID="cri-o://73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43" gracePeriod=2 Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.287302 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.347585 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities\") pod \"8726cc4c-9e5a-4d44-bfce-4e2925051764\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.347666 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content\") pod \"8726cc4c-9e5a-4d44-bfce-4e2925051764\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.347759 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tkj7\" (UniqueName: \"kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7\") pod \"8726cc4c-9e5a-4d44-bfce-4e2925051764\" (UID: \"8726cc4c-9e5a-4d44-bfce-4e2925051764\") " Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.348470 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities" (OuterVolumeSpecName: "utilities") pod "8726cc4c-9e5a-4d44-bfce-4e2925051764" (UID: "8726cc4c-9e5a-4d44-bfce-4e2925051764"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.353434 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7" (OuterVolumeSpecName: "kube-api-access-2tkj7") pod "8726cc4c-9e5a-4d44-bfce-4e2925051764" (UID: "8726cc4c-9e5a-4d44-bfce-4e2925051764"). InnerVolumeSpecName "kube-api-access-2tkj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.406208 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8726cc4c-9e5a-4d44-bfce-4e2925051764" (UID: "8726cc4c-9e5a-4d44-bfce-4e2925051764"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.449870 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tkj7\" (UniqueName: \"kubernetes.io/projected/8726cc4c-9e5a-4d44-bfce-4e2925051764-kube-api-access-2tkj7\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.449910 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.449920 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8726cc4c-9e5a-4d44-bfce-4e2925051764-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.751222 5118 generic.go:334] "Generic (PLEG): container finished" podID="8726cc4c-9e5a-4d44-bfce-4e2925051764" 
containerID="73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43" exitCode=0 Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.751287 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r9zdn" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.751289 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerDied","Data":"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43"} Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.751388 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r9zdn" event={"ID":"8726cc4c-9e5a-4d44-bfce-4e2925051764","Type":"ContainerDied","Data":"5afdae68e5147ecac15d7b7b2dff7b45e534e3f4cd91c19bf260639782dbe0f2"} Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.751436 5118 scope.go:117] "RemoveContainer" containerID="73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.779440 5118 scope.go:117] "RemoveContainer" containerID="9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.789204 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.799760 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r9zdn"] Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.824373 5118 scope.go:117] "RemoveContainer" containerID="b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.855231 5118 scope.go:117] "RemoveContainer" containerID="73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43" Feb 23 
09:10:20 crc kubenswrapper[5118]: E0223 09:10:20.855715 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43\": container with ID starting with 73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43 not found: ID does not exist" containerID="73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.855746 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43"} err="failed to get container status \"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43\": rpc error: code = NotFound desc = could not find container \"73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43\": container with ID starting with 73510e299042aabc5b6df5d1f0a27e8e7336eec1827c440e90f6e5a887ad2c43 not found: ID does not exist" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.855766 5118 scope.go:117] "RemoveContainer" containerID="9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934" Feb 23 09:10:20 crc kubenswrapper[5118]: E0223 09:10:20.856080 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934\": container with ID starting with 9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934 not found: ID does not exist" containerID="9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.856112 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934"} err="failed to get container status 
\"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934\": rpc error: code = NotFound desc = could not find container \"9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934\": container with ID starting with 9a56246bc5c8995c4fea857239f7d137c58542386d27eefb7969149f78bea934 not found: ID does not exist" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.856123 5118 scope.go:117] "RemoveContainer" containerID="b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c" Feb 23 09:10:20 crc kubenswrapper[5118]: E0223 09:10:20.856472 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c\": container with ID starting with b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c not found: ID does not exist" containerID="b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c" Feb 23 09:10:20 crc kubenswrapper[5118]: I0223 09:10:20.856516 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c"} err="failed to get container status \"b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c\": rpc error: code = NotFound desc = could not find container \"b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c\": container with ID starting with b73e9c1c90802cd711c1b68520eabb94a9c2fbb6a4557a0ba07286930fa1ac8c not found: ID does not exist" Feb 23 09:10:21 crc kubenswrapper[5118]: I0223 09:10:21.715508 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" path="/var/lib/kubelet/pods/8726cc4c-9e5a-4d44-bfce-4e2925051764/volumes" Feb 23 09:10:21 crc kubenswrapper[5118]: I0223 09:10:21.902984 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llqt5" 
podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" probeResult="failure" output=< Feb 23 09:10:21 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:10:21 crc kubenswrapper[5118]: > Feb 23 09:10:31 crc kubenswrapper[5118]: I0223 09:10:31.909631 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llqt5" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" probeResult="failure" output=< Feb 23 09:10:31 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:10:31 crc kubenswrapper[5118]: > Feb 23 09:10:32 crc kubenswrapper[5118]: I0223 09:10:32.974939 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:10:32 crc kubenswrapper[5118]: I0223 09:10:32.975273 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:10:41 crc kubenswrapper[5118]: I0223 09:10:41.903335 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-llqt5" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" probeResult="failure" output=< Feb 23 09:10:41 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:10:41 crc kubenswrapper[5118]: > Feb 23 09:10:50 crc kubenswrapper[5118]: I0223 09:10:50.916767 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:50 crc kubenswrapper[5118]: I0223 09:10:50.980162 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:51 crc kubenswrapper[5118]: I0223 09:10:51.156813 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.060090 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-llqt5" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" containerID="cri-o://44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851" gracePeriod=2 Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.553812 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.674312 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content\") pod \"537b0b99-87da-4848-852a-84e6bdd5a361\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.674349 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhzt\" (UniqueName: \"kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt\") pod \"537b0b99-87da-4848-852a-84e6bdd5a361\" (UID: \"537b0b99-87da-4848-852a-84e6bdd5a361\") " Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.674398 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities\") pod \"537b0b99-87da-4848-852a-84e6bdd5a361\" (UID: 
\"537b0b99-87da-4848-852a-84e6bdd5a361\") " Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.675591 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities" (OuterVolumeSpecName: "utilities") pod "537b0b99-87da-4848-852a-84e6bdd5a361" (UID: "537b0b99-87da-4848-852a-84e6bdd5a361"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.680389 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt" (OuterVolumeSpecName: "kube-api-access-qdhzt") pod "537b0b99-87da-4848-852a-84e6bdd5a361" (UID: "537b0b99-87da-4848-852a-84e6bdd5a361"). InnerVolumeSpecName "kube-api-access-qdhzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.777413 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhzt\" (UniqueName: \"kubernetes.io/projected/537b0b99-87da-4848-852a-84e6bdd5a361-kube-api-access-qdhzt\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.777448 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.805692 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "537b0b99-87da-4848-852a-84e6bdd5a361" (UID: "537b0b99-87da-4848-852a-84e6bdd5a361"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:10:52 crc kubenswrapper[5118]: I0223 09:10:52.879036 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/537b0b99-87da-4848-852a-84e6bdd5a361-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.071548 5118 generic.go:334] "Generic (PLEG): container finished" podID="537b0b99-87da-4848-852a-84e6bdd5a361" containerID="44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851" exitCode=0 Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.071600 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerDied","Data":"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851"} Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.071629 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-llqt5" event={"ID":"537b0b99-87da-4848-852a-84e6bdd5a361","Type":"ContainerDied","Data":"e2111cb97b32731972aa9042b6a4707f811a0774349ded355cb63e97d093b263"} Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.071649 5118 scope.go:117] "RemoveContainer" containerID="44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.071709 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-llqt5" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.104233 5118 scope.go:117] "RemoveContainer" containerID="b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.111178 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.121557 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-llqt5"] Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.132689 5118 scope.go:117] "RemoveContainer" containerID="36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.174551 5118 scope.go:117] "RemoveContainer" containerID="44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851" Feb 23 09:10:53 crc kubenswrapper[5118]: E0223 09:10:53.179190 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851\": container with ID starting with 44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851 not found: ID does not exist" containerID="44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.179243 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851"} err="failed to get container status \"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851\": rpc error: code = NotFound desc = could not find container \"44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851\": container with ID starting with 44fa874e1e7f0d660caae94c0dc32e6ad66a617dec262320e2a37cc602235851 not found: ID does 
not exist" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.179271 5118 scope.go:117] "RemoveContainer" containerID="b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68" Feb 23 09:10:53 crc kubenswrapper[5118]: E0223 09:10:53.179659 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68\": container with ID starting with b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68 not found: ID does not exist" containerID="b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.179721 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68"} err="failed to get container status \"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68\": rpc error: code = NotFound desc = could not find container \"b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68\": container with ID starting with b884cd75166bf9378a6e0ac579e2f809d28c27e63b1b1d4665545e6d7c45cf68 not found: ID does not exist" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.179763 5118 scope.go:117] "RemoveContainer" containerID="36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6" Feb 23 09:10:53 crc kubenswrapper[5118]: E0223 09:10:53.180126 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6\": container with ID starting with 36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6 not found: ID does not exist" containerID="36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.180161 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6"} err="failed to get container status \"36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6\": rpc error: code = NotFound desc = could not find container \"36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6\": container with ID starting with 36a60ce628c9f1be84ccd85b5856bc39d834840fa5b722b850878580b88c3db6 not found: ID does not exist" Feb 23 09:10:53 crc kubenswrapper[5118]: I0223 09:10:53.713842 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" path="/var/lib/kubelet/pods/537b0b99-87da-4848-852a-84e6bdd5a361/volumes" Feb 23 09:11:00 crc kubenswrapper[5118]: I0223 09:11:00.179749 5118 generic.go:334] "Generic (PLEG): container finished" podID="5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" containerID="9e8606ee682c3e992981c48b640d9f6ce36f945c2fdb1521422879a1f251d7cd" exitCode=0 Feb 23 09:11:00 crc kubenswrapper[5118]: I0223 09:11:00.179864 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-krbms" event={"ID":"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1","Type":"ContainerDied","Data":"9e8606ee682c3e992981c48b640d9f6ce36f945c2fdb1521422879a1f251d7cd"} Feb 23 09:11:00 crc kubenswrapper[5118]: I0223 09:11:00.183186 5118 generic.go:334] "Generic (PLEG): container finished" podID="687c6ebc-4763-49bb-8293-b1e92f4f39ca" containerID="e3f7d9c4998943af03b91c92d36e2459cc696562643d4b2561d6e11d44c0d217" exitCode=0 Feb 23 09:11:00 crc kubenswrapper[5118]: I0223 09:11:00.183210 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" event={"ID":"687c6ebc-4763-49bb-8293-b1e92f4f39ca","Type":"ContainerDied","Data":"e3f7d9c4998943af03b91c92d36e2459cc696562643d4b2561d6e11d44c0d217"} Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.646761 5118 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.769313 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory\") pod \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.769416 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnmsp\" (UniqueName: \"kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp\") pod \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.769659 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1\") pod \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.769714 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph\") pod \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.769754 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle\") pod \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\" (UID: \"687c6ebc-4763-49bb-8293-b1e92f4f39ca\") " Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.776138 5118 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph" (OuterVolumeSpecName: "ceph") pod "687c6ebc-4763-49bb-8293-b1e92f4f39ca" (UID: "687c6ebc-4763-49bb-8293-b1e92f4f39ca"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.776349 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "687c6ebc-4763-49bb-8293-b1e92f4f39ca" (UID: "687c6ebc-4763-49bb-8293-b1e92f4f39ca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.777859 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp" (OuterVolumeSpecName: "kube-api-access-nnmsp") pod "687c6ebc-4763-49bb-8293-b1e92f4f39ca" (UID: "687c6ebc-4763-49bb-8293-b1e92f4f39ca"). InnerVolumeSpecName "kube-api-access-nnmsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.800186 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory" (OuterVolumeSpecName: "inventory") pod "687c6ebc-4763-49bb-8293-b1e92f4f39ca" (UID: "687c6ebc-4763-49bb-8293-b1e92f4f39ca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.801302 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "687c6ebc-4763-49bb-8293-b1e92f4f39ca" (UID: "687c6ebc-4763-49bb-8293-b1e92f4f39ca"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.872859 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnmsp\" (UniqueName: \"kubernetes.io/projected/687c6ebc-4763-49bb-8293-b1e92f4f39ca-kube-api-access-nnmsp\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.872915 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.872928 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.872941 5118 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:01 crc kubenswrapper[5118]: I0223 09:11:01.872957 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/687c6ebc-4763-49bb-8293-b1e92f4f39ca-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.203181 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" event={"ID":"687c6ebc-4763-49bb-8293-b1e92f4f39ca","Type":"ContainerDied","Data":"21858828c0261bff85a25e2f0a93b5c62fce30cc37c3179525580950df47ff5d"} Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.203235 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21858828c0261bff85a25e2f0a93b5c62fce30cc37c3179525580950df47ff5d" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.203308 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-tq4rm" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.313755 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fpn5p"] Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314336 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="extract-content" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314361 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="extract-content" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314380 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314388 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314410 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687c6ebc-4763-49bb-8293-b1e92f4f39ca" containerName="bootstrap-openstack-openstack-cell1" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314418 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="687c6ebc-4763-49bb-8293-b1e92f4f39ca" 
containerName="bootstrap-openstack-openstack-cell1" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314428 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="extract-utilities" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314435 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="extract-utilities" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314461 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="extract-utilities" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314468 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="extract-utilities" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314483 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314488 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: E0223 09:11:02.314494 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="extract-content" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314500 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="extract-content" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314685 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="537b0b99-87da-4848-852a-84e6bdd5a361" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314699 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="687c6ebc-4763-49bb-8293-b1e92f4f39ca" containerName="bootstrap-openstack-openstack-cell1" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.314710 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8726cc4c-9e5a-4d44-bfce-4e2925051764" containerName="registry-server" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.315557 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.319004 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.319832 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.329641 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fpn5p"] Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.382418 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.382673 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgmk7\" (UniqueName: \"kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: 
I0223 09:11:02.382802 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.382887 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.485411 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.485480 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgmk7\" (UniqueName: \"kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.485577 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: 
\"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.485612 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.503243 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.503699 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.504341 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.505980 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgmk7\" (UniqueName: 
\"kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7\") pod \"download-cache-openstack-openstack-cell1-fpn5p\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.636431 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.643841 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.689870 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle\") pod \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.689937 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory\") pod \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.690213 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr2ql\" (UniqueName: \"kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql\") pod \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.690284 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker\") pod \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\" (UID: \"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1\") " Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.696466 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" (UID: "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.696608 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql" (OuterVolumeSpecName: "kube-api-access-sr2ql") pod "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" (UID: "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1"). InnerVolumeSpecName "kube-api-access-sr2ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.724562 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory" (OuterVolumeSpecName: "inventory") pod "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" (UID: "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.730895 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" (UID: "5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.798611 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr2ql\" (UniqueName: \"kubernetes.io/projected/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-kube-api-access-sr2ql\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.798758 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.798774 5118 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.798787 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.975183 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.975249 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.975299 5118 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.976241 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:11:02 crc kubenswrapper[5118]: I0223 09:11:02.976306 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720" gracePeriod=600 Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.173467 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fpn5p"] Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.220154 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-krbms" event={"ID":"5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1","Type":"ContainerDied","Data":"2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f"} Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.220200 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2361ee7363e87503b419b2002dc8a51658ebaf99e7cf807a7771634389b0011f" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.220278 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-krbms" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.221951 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" event={"ID":"7bba7957-c51c-4948-8946-e3f32d217698","Type":"ContainerStarted","Data":"88a29544da5e108498eec89069585018aab6e4086938846eefb47eca301e348a"} Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.234444 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720" exitCode=0 Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.234489 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720"} Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.234519 5118 scope.go:117] "RemoveContainer" containerID="3073685a73baf8c358608fe0735d7824255cf9da963acf1ee226f5a80eefd025" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.728856 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-q6vdr"] Feb 23 09:11:03 crc kubenswrapper[5118]: E0223 09:11:03.729699 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" containerName="bootstrap-openstack-openstack-networker" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.729715 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" containerName="bootstrap-openstack-openstack-networker" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.729925 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1" 
containerName="bootstrap-openstack-openstack-networker" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.731028 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.734796 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.735090 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.748467 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-q6vdr"] Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.818233 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.820364 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbgxb\" (UniqueName: \"kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.820621 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.923216 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.923551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbgxb\" (UniqueName: \"kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.923693 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.928354 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " 
pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.930643 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:03 crc kubenswrapper[5118]: I0223 09:11:03.939750 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbgxb\" (UniqueName: \"kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb\") pod \"download-cache-openstack-openstack-networker-q6vdr\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:04 crc kubenswrapper[5118]: I0223 09:11:04.136241 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:11:04 crc kubenswrapper[5118]: I0223 09:11:04.251234 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" event={"ID":"7bba7957-c51c-4948-8946-e3f32d217698","Type":"ContainerStarted","Data":"b2344a5b55228346b193fc1460186b37dacef042020af7de93bff9709d38a217"} Feb 23 09:11:04 crc kubenswrapper[5118]: I0223 09:11:04.261113 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"} Feb 23 09:11:04 crc kubenswrapper[5118]: I0223 09:11:04.277370 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" podStartSLOduration=1.8288891600000001 podStartE2EDuration="2.277351231s" podCreationTimestamp="2026-02-23 09:11:02 +0000 UTC" firstStartedPulling="2026-02-23 09:11:03.183045772 +0000 UTC m=+8726.186830335" lastFinishedPulling="2026-02-23 09:11:03.631507833 +0000 UTC m=+8726.635292406" observedRunningTime="2026-02-23 09:11:04.264675066 +0000 UTC m=+8727.268459649" watchObservedRunningTime="2026-02-23 09:11:04.277351231 +0000 UTC m=+8727.281135814" Feb 23 09:11:04 crc kubenswrapper[5118]: I0223 09:11:04.682084 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-q6vdr"] Feb 23 09:11:05 crc kubenswrapper[5118]: I0223 09:11:05.275138 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" event={"ID":"4a112539-48d7-414b-9d7e-b63e8afb0244","Type":"ContainerStarted","Data":"f555e868bb5b9339441a311f775087b4cafff51ca69b7bac52d3ab3deb48feaf"} Feb 23 09:11:06 crc kubenswrapper[5118]: I0223 09:11:06.286086 
5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" event={"ID":"4a112539-48d7-414b-9d7e-b63e8afb0244","Type":"ContainerStarted","Data":"5ee03c9fc9d540ea145b1963b1070591d682736014a0769f413730a8ec420439"} Feb 23 09:11:06 crc kubenswrapper[5118]: I0223 09:11:06.308607 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" podStartSLOduration=2.847539802 podStartE2EDuration="3.308579385s" podCreationTimestamp="2026-02-23 09:11:03 +0000 UTC" firstStartedPulling="2026-02-23 09:11:04.673631294 +0000 UTC m=+8727.677415867" lastFinishedPulling="2026-02-23 09:11:05.134670857 +0000 UTC m=+8728.138455450" observedRunningTime="2026-02-23 09:11:06.300091149 +0000 UTC m=+8729.303875732" watchObservedRunningTime="2026-02-23 09:11:06.308579385 +0000 UTC m=+8729.312363958" Feb 23 09:12:12 crc kubenswrapper[5118]: I0223 09:12:12.957202 5118 generic.go:334] "Generic (PLEG): container finished" podID="4a112539-48d7-414b-9d7e-b63e8afb0244" containerID="5ee03c9fc9d540ea145b1963b1070591d682736014a0769f413730a8ec420439" exitCode=0 Feb 23 09:12:12 crc kubenswrapper[5118]: I0223 09:12:12.957344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" event={"ID":"4a112539-48d7-414b-9d7e-b63e8afb0244","Type":"ContainerDied","Data":"5ee03c9fc9d540ea145b1963b1070591d682736014a0769f413730a8ec420439"} Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.432553 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.572188 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory\") pod \"4a112539-48d7-414b-9d7e-b63e8afb0244\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.572393 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbgxb\" (UniqueName: \"kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb\") pod \"4a112539-48d7-414b-9d7e-b63e8afb0244\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.572472 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker\") pod \"4a112539-48d7-414b-9d7e-b63e8afb0244\" (UID: \"4a112539-48d7-414b-9d7e-b63e8afb0244\") " Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.578718 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb" (OuterVolumeSpecName: "kube-api-access-vbgxb") pod "4a112539-48d7-414b-9d7e-b63e8afb0244" (UID: "4a112539-48d7-414b-9d7e-b63e8afb0244"). InnerVolumeSpecName "kube-api-access-vbgxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.601374 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "4a112539-48d7-414b-9d7e-b63e8afb0244" (UID: "4a112539-48d7-414b-9d7e-b63e8afb0244"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.616088 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory" (OuterVolumeSpecName: "inventory") pod "4a112539-48d7-414b-9d7e-b63e8afb0244" (UID: "4a112539-48d7-414b-9d7e-b63e8afb0244"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.674855 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbgxb\" (UniqueName: \"kubernetes.io/projected/4a112539-48d7-414b-9d7e-b63e8afb0244-kube-api-access-vbgxb\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.674907 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.674920 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a112539-48d7-414b-9d7e-b63e8afb0244-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.975702 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" 
event={"ID":"4a112539-48d7-414b-9d7e-b63e8afb0244","Type":"ContainerDied","Data":"f555e868bb5b9339441a311f775087b4cafff51ca69b7bac52d3ab3deb48feaf"} Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.975748 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f555e868bb5b9339441a311f775087b4cafff51ca69b7bac52d3ab3deb48feaf" Feb 23 09:12:14 crc kubenswrapper[5118]: I0223 09:12:14.975832 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-q6vdr" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.065598 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-gtzln"] Feb 23 09:12:15 crc kubenswrapper[5118]: E0223 09:12:15.066341 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a112539-48d7-414b-9d7e-b63e8afb0244" containerName="download-cache-openstack-openstack-networker" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.066372 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a112539-48d7-414b-9d7e-b63e8afb0244" containerName="download-cache-openstack-openstack-networker" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.066692 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a112539-48d7-414b-9d7e-b63e8afb0244" containerName="download-cache-openstack-openstack-networker" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.067659 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.072059 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.072128 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.081407 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-gtzln"] Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.184651 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.185070 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6vqh\" (UniqueName: \"kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.185163 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " 
pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.288393 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6vqh\" (UniqueName: \"kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.288529 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.288814 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.294600 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.295735 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.312978 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6vqh\" (UniqueName: \"kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh\") pod \"configure-network-openstack-openstack-networker-gtzln\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.387028 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.940687 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-gtzln"] Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.952517 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:12:15 crc kubenswrapper[5118]: I0223 09:12:15.986353 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-gtzln" event={"ID":"bfa05bc3-3fc4-4d65-a467-6096899d3260","Type":"ContainerStarted","Data":"3cc71710aeb9a26a14da09017a10cd899e364fc605779fbb4df7093f4654c28c"} Feb 23 09:12:16 crc kubenswrapper[5118]: I0223 09:12:16.999251 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-gtzln" event={"ID":"bfa05bc3-3fc4-4d65-a467-6096899d3260","Type":"ContainerStarted","Data":"b7151c646046225fcb8d8a17628d158f0d80da691de99c24c969cc001de001ba"} Feb 23 09:12:17 crc 
kubenswrapper[5118]: I0223 09:12:17.022914 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-gtzln" podStartSLOduration=1.5782193709999999 podStartE2EDuration="2.022891141s" podCreationTimestamp="2026-02-23 09:12:15 +0000 UTC" firstStartedPulling="2026-02-23 09:12:15.952257712 +0000 UTC m=+8798.956042285" lastFinishedPulling="2026-02-23 09:12:16.396929482 +0000 UTC m=+8799.400714055" observedRunningTime="2026-02-23 09:12:17.019508538 +0000 UTC m=+8800.023293131" watchObservedRunningTime="2026-02-23 09:12:17.022891141 +0000 UTC m=+8800.026675724" Feb 23 09:13:01 crc kubenswrapper[5118]: I0223 09:13:01.520186 5118 generic.go:334] "Generic (PLEG): container finished" podID="7bba7957-c51c-4948-8946-e3f32d217698" containerID="b2344a5b55228346b193fc1460186b37dacef042020af7de93bff9709d38a217" exitCode=0 Feb 23 09:13:01 crc kubenswrapper[5118]: I0223 09:13:01.520284 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" event={"ID":"7bba7957-c51c-4948-8946-e3f32d217698","Type":"ContainerDied","Data":"b2344a5b55228346b193fc1460186b37dacef042020af7de93bff9709d38a217"} Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.034242 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.157442 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgmk7\" (UniqueName: \"kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7\") pod \"7bba7957-c51c-4948-8946-e3f32d217698\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.157833 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph\") pod \"7bba7957-c51c-4948-8946-e3f32d217698\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.158017 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory\") pod \"7bba7957-c51c-4948-8946-e3f32d217698\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.158256 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1\") pod \"7bba7957-c51c-4948-8946-e3f32d217698\" (UID: \"7bba7957-c51c-4948-8946-e3f32d217698\") " Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.167378 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7" (OuterVolumeSpecName: "kube-api-access-sgmk7") pod "7bba7957-c51c-4948-8946-e3f32d217698" (UID: "7bba7957-c51c-4948-8946-e3f32d217698"). InnerVolumeSpecName "kube-api-access-sgmk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.187448 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph" (OuterVolumeSpecName: "ceph") pod "7bba7957-c51c-4948-8946-e3f32d217698" (UID: "7bba7957-c51c-4948-8946-e3f32d217698"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.263498 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgmk7\" (UniqueName: \"kubernetes.io/projected/7bba7957-c51c-4948-8946-e3f32d217698-kube-api-access-sgmk7\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.263553 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.304073 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7bba7957-c51c-4948-8946-e3f32d217698" (UID: "7bba7957-c51c-4948-8946-e3f32d217698"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.327275 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory" (OuterVolumeSpecName: "inventory") pod "7bba7957-c51c-4948-8946-e3f32d217698" (UID: "7bba7957-c51c-4948-8946-e3f32d217698"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.372367 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.372402 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bba7957-c51c-4948-8946-e3f32d217698-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.540366 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" event={"ID":"7bba7957-c51c-4948-8946-e3f32d217698","Type":"ContainerDied","Data":"88a29544da5e108498eec89069585018aab6e4086938846eefb47eca301e348a"} Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.540428 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a29544da5e108498eec89069585018aab6e4086938846eefb47eca301e348a" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.540524 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fpn5p" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.640896 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-znjvp"] Feb 23 09:13:03 crc kubenswrapper[5118]: E0223 09:13:03.641596 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bba7957-c51c-4948-8946-e3f32d217698" containerName="download-cache-openstack-openstack-cell1" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.641615 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bba7957-c51c-4948-8946-e3f32d217698" containerName="download-cache-openstack-openstack-cell1" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.641823 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bba7957-c51c-4948-8946-e3f32d217698" containerName="download-cache-openstack-openstack-cell1" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.642727 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.645287 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.645691 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.650467 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-znjvp"] Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.779938 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.780528 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npmkf\" (UniqueName: \"kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.782021 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 
23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.782123 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.883925 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npmkf\" (UniqueName: \"kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.884154 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.884230 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.884286 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory\") pod 
\"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.888319 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.888904 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.889218 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.910527 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npmkf\" (UniqueName: \"kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf\") pod \"configure-network-openstack-openstack-cell1-znjvp\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:03 crc kubenswrapper[5118]: I0223 09:13:03.959041 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:13:04 crc kubenswrapper[5118]: I0223 09:13:04.570154 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-znjvp"] Feb 23 09:13:05 crc kubenswrapper[5118]: I0223 09:13:05.560470 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" event={"ID":"c0594606-fafa-4f81-9497-2b0a637ef8d8","Type":"ContainerStarted","Data":"7d27b889ff73bdc8154520d56e4d3627dfc496647176dddbf40cec2d3e926a8c"} Feb 23 09:13:05 crc kubenswrapper[5118]: I0223 09:13:05.560769 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" event={"ID":"c0594606-fafa-4f81-9497-2b0a637ef8d8","Type":"ContainerStarted","Data":"34e00294e197f1e812a010d1ff4fcf45f47a539daf2ba4ce72fb38b6c6077dc3"} Feb 23 09:13:05 crc kubenswrapper[5118]: I0223 09:13:05.584873 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" podStartSLOduration=2.184421266 podStartE2EDuration="2.584849599s" podCreationTimestamp="2026-02-23 09:13:03 +0000 UTC" firstStartedPulling="2026-02-23 09:13:04.579234847 +0000 UTC m=+8847.583019420" lastFinishedPulling="2026-02-23 09:13:04.97966318 +0000 UTC m=+8847.983447753" observedRunningTime="2026-02-23 09:13:05.573988648 +0000 UTC m=+8848.577773241" watchObservedRunningTime="2026-02-23 09:13:05.584849599 +0000 UTC m=+8848.588634172" Feb 23 09:13:18 crc kubenswrapper[5118]: I0223 09:13:18.684944 5118 generic.go:334] "Generic (PLEG): container finished" podID="bfa05bc3-3fc4-4d65-a467-6096899d3260" containerID="b7151c646046225fcb8d8a17628d158f0d80da691de99c24c969cc001de001ba" exitCode=0 Feb 23 09:13:18 crc kubenswrapper[5118]: I0223 09:13:18.685037 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-openstack-openstack-networker-gtzln" event={"ID":"bfa05bc3-3fc4-4d65-a467-6096899d3260","Type":"ContainerDied","Data":"b7151c646046225fcb8d8a17628d158f0d80da691de99c24c969cc001de001ba"} Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.100223 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.222435 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker\") pod \"bfa05bc3-3fc4-4d65-a467-6096899d3260\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.222645 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6vqh\" (UniqueName: \"kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh\") pod \"bfa05bc3-3fc4-4d65-a467-6096899d3260\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.223358 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory\") pod \"bfa05bc3-3fc4-4d65-a467-6096899d3260\" (UID: \"bfa05bc3-3fc4-4d65-a467-6096899d3260\") " Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.230509 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh" (OuterVolumeSpecName: "kube-api-access-c6vqh") pod "bfa05bc3-3fc4-4d65-a467-6096899d3260" (UID: "bfa05bc3-3fc4-4d65-a467-6096899d3260"). InnerVolumeSpecName "kube-api-access-c6vqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.255360 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "bfa05bc3-3fc4-4d65-a467-6096899d3260" (UID: "bfa05bc3-3fc4-4d65-a467-6096899d3260"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.255679 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory" (OuterVolumeSpecName: "inventory") pod "bfa05bc3-3fc4-4d65-a467-6096899d3260" (UID: "bfa05bc3-3fc4-4d65-a467-6096899d3260"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.325940 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6vqh\" (UniqueName: \"kubernetes.io/projected/bfa05bc3-3fc4-4d65-a467-6096899d3260-kube-api-access-c6vqh\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.325987 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.325997 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/bfa05bc3-3fc4-4d65-a467-6096899d3260-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.704197 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-gtzln" 
event={"ID":"bfa05bc3-3fc4-4d65-a467-6096899d3260","Type":"ContainerDied","Data":"3cc71710aeb9a26a14da09017a10cd899e364fc605779fbb4df7093f4654c28c"} Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.704233 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc71710aeb9a26a14da09017a10cd899e364fc605779fbb4df7093f4654c28c" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.704246 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-gtzln" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.793583 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-4ssqk"] Feb 23 09:13:20 crc kubenswrapper[5118]: E0223 09:13:20.794088 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa05bc3-3fc4-4d65-a467-6096899d3260" containerName="configure-network-openstack-openstack-networker" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.794126 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa05bc3-3fc4-4d65-a467-6096899d3260" containerName="configure-network-openstack-openstack-networker" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.794335 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa05bc3-3fc4-4d65-a467-6096899d3260" containerName="configure-network-openstack-openstack-networker" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.795046 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.796670 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.796945 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.804786 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-4ssqk"] Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.836450 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb86\" (UniqueName: \"kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.836490 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.836655 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " 
pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.937973 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.938081 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb86\" (UniqueName: \"kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.938201 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.943685 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.943771 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:20 crc kubenswrapper[5118]: I0223 09:13:20.954468 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb86\" (UniqueName: \"kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86\") pod \"validate-network-openstack-openstack-networker-4ssqk\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:21 crc kubenswrapper[5118]: I0223 09:13:21.113032 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:21 crc kubenswrapper[5118]: I0223 09:13:21.845933 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-4ssqk"] Feb 23 09:13:22 crc kubenswrapper[5118]: I0223 09:13:22.723674 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" event={"ID":"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc","Type":"ContainerStarted","Data":"f097946561f5c62559b118cf945d0659e2a81fe131a2eb059045836a713f2a07"} Feb 23 09:13:22 crc kubenswrapper[5118]: I0223 09:13:22.724130 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" event={"ID":"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc","Type":"ContainerStarted","Data":"da447d7ca9e939691a31ee31c7b9b3f257968f00e0bec8fdb110a92737d792eb"} Feb 23 09:13:22 crc kubenswrapper[5118]: I0223 09:13:22.747882 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-openstack-openstack-networker-4ssqk" podStartSLOduration=2.314755297 podStartE2EDuration="2.747866208s" podCreationTimestamp="2026-02-23 09:13:20 +0000 UTC" firstStartedPulling="2026-02-23 09:13:21.853050698 +0000 UTC m=+8864.856835271" lastFinishedPulling="2026-02-23 09:13:22.286161569 +0000 UTC m=+8865.289946182" observedRunningTime="2026-02-23 09:13:22.742460819 +0000 UTC m=+8865.746245392" watchObservedRunningTime="2026-02-23 09:13:22.747866208 +0000 UTC m=+8865.751650771" Feb 23 09:13:27 crc kubenswrapper[5118]: I0223 09:13:27.782820 5118 generic.go:334] "Generic (PLEG): container finished" podID="fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" containerID="f097946561f5c62559b118cf945d0659e2a81fe131a2eb059045836a713f2a07" exitCode=0 Feb 23 09:13:27 crc kubenswrapper[5118]: I0223 09:13:27.782903 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" event={"ID":"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc","Type":"ContainerDied","Data":"f097946561f5c62559b118cf945d0659e2a81fe131a2eb059045836a713f2a07"} Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.252981 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.429867 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker\") pod \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.430066 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory\") pod \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.430174 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vb86\" (UniqueName: \"kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86\") pod \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\" (UID: \"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc\") " Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.436729 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86" (OuterVolumeSpecName: "kube-api-access-6vb86") pod "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" (UID: "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc"). InnerVolumeSpecName "kube-api-access-6vb86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.474685 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory" (OuterVolumeSpecName: "inventory") pod "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" (UID: "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.479233 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" (UID: "fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.533636 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.533699 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vb86\" (UniqueName: \"kubernetes.io/projected/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-kube-api-access-6vb86\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.533720 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.807947 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" event={"ID":"fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc","Type":"ContainerDied","Data":"da447d7ca9e939691a31ee31c7b9b3f257968f00e0bec8fdb110a92737d792eb"} Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.807984 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da447d7ca9e939691a31ee31c7b9b3f257968f00e0bec8fdb110a92737d792eb" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 
09:13:29.808399 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-4ssqk" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.874516 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-8s5dc"] Feb 23 09:13:29 crc kubenswrapper[5118]: E0223 09:13:29.875074 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" containerName="validate-network-openstack-openstack-networker" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.875116 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" containerName="validate-network-openstack-openstack-networker" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.875356 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc" containerName="validate-network-openstack-openstack-networker" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.876349 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.879188 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.879228 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:13:29 crc kubenswrapper[5118]: I0223 09:13:29.885219 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-8s5dc"] Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.044617 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.044731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdjkv\" (UniqueName: \"kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.044769 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 
09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.146509 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdjkv\" (UniqueName: \"kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.146631 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.147490 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.151313 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.151905 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory\") pod 
\"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.178316 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdjkv\" (UniqueName: \"kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv\") pod \"install-os-openstack-openstack-networker-8s5dc\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.194453 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.740494 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-8s5dc"] Feb 23 09:13:30 crc kubenswrapper[5118]: I0223 09:13:30.817868 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-8s5dc" event={"ID":"350a056c-4003-4872-b099-2a475d6aabe9","Type":"ContainerStarted","Data":"af58dd2a1d6169b2f7aa740b99f494ae1f080947c3abb12ae1b6ac667a5182db"} Feb 23 09:13:31 crc kubenswrapper[5118]: I0223 09:13:31.827030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-8s5dc" event={"ID":"350a056c-4003-4872-b099-2a475d6aabe9","Type":"ContainerStarted","Data":"97b68201467062490c6f0e86f1cc910649474143199f3dc5c6aec9997d79a334"} Feb 23 09:13:31 crc kubenswrapper[5118]: I0223 09:13:31.858265 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-8s5dc" podStartSLOduration=2.458790466 podStartE2EDuration="2.858248266s" podCreationTimestamp="2026-02-23 09:13:29 +0000 UTC" firstStartedPulling="2026-02-23 
09:13:30.746975428 +0000 UTC m=+8873.750760001" lastFinishedPulling="2026-02-23 09:13:31.146433228 +0000 UTC m=+8874.150217801" observedRunningTime="2026-02-23 09:13:31.847893927 +0000 UTC m=+8874.851678510" watchObservedRunningTime="2026-02-23 09:13:31.858248266 +0000 UTC m=+8874.862032839" Feb 23 09:13:32 crc kubenswrapper[5118]: I0223 09:13:32.975351 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:13:32 crc kubenswrapper[5118]: I0223 09:13:32.975601 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:14:02 crc kubenswrapper[5118]: I0223 09:14:02.975453 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:14:02 crc kubenswrapper[5118]: I0223 09:14:02.976411 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:14:07 crc kubenswrapper[5118]: I0223 09:14:07.203259 5118 generic.go:334] "Generic (PLEG): container finished" podID="c0594606-fafa-4f81-9497-2b0a637ef8d8" 
containerID="7d27b889ff73bdc8154520d56e4d3627dfc496647176dddbf40cec2d3e926a8c" exitCode=0 Feb 23 09:14:07 crc kubenswrapper[5118]: I0223 09:14:07.203974 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" event={"ID":"c0594606-fafa-4f81-9497-2b0a637ef8d8","Type":"ContainerDied","Data":"7d27b889ff73bdc8154520d56e4d3627dfc496647176dddbf40cec2d3e926a8c"} Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.615272 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.784177 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npmkf\" (UniqueName: \"kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf\") pod \"c0594606-fafa-4f81-9497-2b0a637ef8d8\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.784361 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory\") pod \"c0594606-fafa-4f81-9497-2b0a637ef8d8\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.784422 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph\") pod \"c0594606-fafa-4f81-9497-2b0a637ef8d8\" (UID: \"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.784475 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1\") pod \"c0594606-fafa-4f81-9497-2b0a637ef8d8\" (UID: 
\"c0594606-fafa-4f81-9497-2b0a637ef8d8\") " Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.789164 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf" (OuterVolumeSpecName: "kube-api-access-npmkf") pod "c0594606-fafa-4f81-9497-2b0a637ef8d8" (UID: "c0594606-fafa-4f81-9497-2b0a637ef8d8"). InnerVolumeSpecName "kube-api-access-npmkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.792526 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph" (OuterVolumeSpecName: "ceph") pod "c0594606-fafa-4f81-9497-2b0a637ef8d8" (UID: "c0594606-fafa-4f81-9497-2b0a637ef8d8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.810332 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c0594606-fafa-4f81-9497-2b0a637ef8d8" (UID: "c0594606-fafa-4f81-9497-2b0a637ef8d8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.815418 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory" (OuterVolumeSpecName: "inventory") pod "c0594606-fafa-4f81-9497-2b0a637ef8d8" (UID: "c0594606-fafa-4f81-9497-2b0a637ef8d8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.887310 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npmkf\" (UniqueName: \"kubernetes.io/projected/c0594606-fafa-4f81-9497-2b0a637ef8d8-kube-api-access-npmkf\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.887575 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.887650 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:08 crc kubenswrapper[5118]: I0223 09:14:08.887706 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c0594606-fafa-4f81-9497-2b0a637ef8d8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.231360 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" event={"ID":"c0594606-fafa-4f81-9497-2b0a637ef8d8","Type":"ContainerDied","Data":"34e00294e197f1e812a010d1ff4fcf45f47a539daf2ba4ce72fb38b6c6077dc3"} Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.231408 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e00294e197f1e812a010d1ff4fcf45f47a539daf2ba4ce72fb38b6c6077dc3" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.231433 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-znjvp" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.326404 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ttqz2"] Feb 23 09:14:09 crc kubenswrapper[5118]: E0223 09:14:09.327175 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0594606-fafa-4f81-9497-2b0a637ef8d8" containerName="configure-network-openstack-openstack-cell1" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.327197 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0594606-fafa-4f81-9497-2b0a637ef8d8" containerName="configure-network-openstack-openstack-cell1" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.327425 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0594606-fafa-4f81-9497-2b0a637ef8d8" containerName="configure-network-openstack-openstack-cell1" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.328443 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.333687 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.334168 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.349708 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ttqz2"] Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.500930 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.501009 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l97w\" (UniqueName: \"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.501090 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 
09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.501167 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.603928 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.604030 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.604211 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.604280 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l97w\" (UniqueName: \"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w\") pod 
\"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.615842 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.615938 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.620325 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.620711 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l97w\" (UniqueName: \"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w\") pod \"validate-network-openstack-openstack-cell1-ttqz2\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:09 crc kubenswrapper[5118]: I0223 09:14:09.656941 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:10 crc kubenswrapper[5118]: I0223 09:14:10.236535 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-ttqz2"] Feb 23 09:14:11 crc kubenswrapper[5118]: I0223 09:14:11.257701 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" event={"ID":"19007ba1-bffe-48df-b101-500178cc1930","Type":"ContainerStarted","Data":"45f2ab1e0b4678ced23c7ae72821cefb3963d8996ca2526a3e548c8a172b1627"} Feb 23 09:14:11 crc kubenswrapper[5118]: I0223 09:14:11.258021 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" event={"ID":"19007ba1-bffe-48df-b101-500178cc1930","Type":"ContainerStarted","Data":"4c351ee80c845e5434a0cb4c3ce3f8d666d21624d9d94ce05c600d4d0f10af65"} Feb 23 09:14:11 crc kubenswrapper[5118]: I0223 09:14:11.281596 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" podStartSLOduration=1.790017736 podStartE2EDuration="2.281579735s" podCreationTimestamp="2026-02-23 09:14:09 +0000 UTC" firstStartedPulling="2026-02-23 09:14:10.251593797 +0000 UTC m=+8913.255378380" lastFinishedPulling="2026-02-23 09:14:10.743155806 +0000 UTC m=+8913.746940379" observedRunningTime="2026-02-23 09:14:11.275205451 +0000 UTC m=+8914.278990024" watchObservedRunningTime="2026-02-23 09:14:11.281579735 +0000 UTC m=+8914.285364308" Feb 23 09:14:16 crc kubenswrapper[5118]: I0223 09:14:16.318826 5118 generic.go:334] "Generic (PLEG): container finished" podID="19007ba1-bffe-48df-b101-500178cc1930" containerID="45f2ab1e0b4678ced23c7ae72821cefb3963d8996ca2526a3e548c8a172b1627" exitCode=0 Feb 23 09:14:16 crc kubenswrapper[5118]: I0223 09:14:16.318977 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" event={"ID":"19007ba1-bffe-48df-b101-500178cc1930","Type":"ContainerDied","Data":"45f2ab1e0b4678ced23c7ae72821cefb3963d8996ca2526a3e548c8a172b1627"} Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.828233 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.992980 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1\") pod \"19007ba1-bffe-48df-b101-500178cc1930\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.993146 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph\") pod \"19007ba1-bffe-48df-b101-500178cc1930\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.993178 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l97w\" (UniqueName: \"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w\") pod \"19007ba1-bffe-48df-b101-500178cc1930\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.993199 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory\") pod \"19007ba1-bffe-48df-b101-500178cc1930\" (UID: \"19007ba1-bffe-48df-b101-500178cc1930\") " Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.998572 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w" (OuterVolumeSpecName: "kube-api-access-6l97w") pod "19007ba1-bffe-48df-b101-500178cc1930" (UID: "19007ba1-bffe-48df-b101-500178cc1930"). InnerVolumeSpecName "kube-api-access-6l97w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:14:17 crc kubenswrapper[5118]: I0223 09:14:17.998720 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph" (OuterVolumeSpecName: "ceph") pod "19007ba1-bffe-48df-b101-500178cc1930" (UID: "19007ba1-bffe-48df-b101-500178cc1930"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.020069 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory" (OuterVolumeSpecName: "inventory") pod "19007ba1-bffe-48df-b101-500178cc1930" (UID: "19007ba1-bffe-48df-b101-500178cc1930"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.021228 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "19007ba1-bffe-48df-b101-500178cc1930" (UID: "19007ba1-bffe-48df-b101-500178cc1930"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.096844 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.096888 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.096902 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l97w\" (UniqueName: \"kubernetes.io/projected/19007ba1-bffe-48df-b101-500178cc1930-kube-api-access-6l97w\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.096915 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19007ba1-bffe-48df-b101-500178cc1930-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.350487 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" event={"ID":"19007ba1-bffe-48df-b101-500178cc1930","Type":"ContainerDied","Data":"4c351ee80c845e5434a0cb4c3ce3f8d666d21624d9d94ce05c600d4d0f10af65"} Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.350526 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c351ee80c845e5434a0cb4c3ce3f8d666d21624d9d94ce05c600d4d0f10af65" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.350541 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-ttqz2" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.432790 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-rjjq4"] Feb 23 09:14:18 crc kubenswrapper[5118]: E0223 09:14:18.433300 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19007ba1-bffe-48df-b101-500178cc1930" containerName="validate-network-openstack-openstack-cell1" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.433322 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="19007ba1-bffe-48df-b101-500178cc1930" containerName="validate-network-openstack-openstack-cell1" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.433556 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="19007ba1-bffe-48df-b101-500178cc1930" containerName="validate-network-openstack-openstack-cell1" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.434337 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.440641 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.440822 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.443155 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-rjjq4"] Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.607660 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn8w7\" (UniqueName: \"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.608145 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.608206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 
09:14:18.608277 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.709984 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn8w7\" (UniqueName: \"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.710046 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.710181 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.710259 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " 
pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.713667 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.713771 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.713923 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.728916 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn8w7\" (UniqueName: \"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7\") pod \"install-os-openstack-openstack-cell1-rjjq4\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:18 crc kubenswrapper[5118]: I0223 09:14:18.768342 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:14:19 crc kubenswrapper[5118]: I0223 09:14:19.322316 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-rjjq4"] Feb 23 09:14:19 crc kubenswrapper[5118]: I0223 09:14:19.362204 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" event={"ID":"2d868181-f742-41c8-b371-869982a76657","Type":"ContainerStarted","Data":"907a125d21c5bf7b8bc9cea615bd7e3de10b02e9be43f250c87094af60493452"} Feb 23 09:14:20 crc kubenswrapper[5118]: I0223 09:14:20.374788 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" event={"ID":"2d868181-f742-41c8-b371-869982a76657","Type":"ContainerStarted","Data":"d316aa5dc492efb05b0a93b63672c95f963686347e751c5795fb382217ecbb46"} Feb 23 09:14:20 crc kubenswrapper[5118]: I0223 09:14:20.405243 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" podStartSLOduration=1.950281746 podStartE2EDuration="2.405220042s" podCreationTimestamp="2026-02-23 09:14:18 +0000 UTC" firstStartedPulling="2026-02-23 09:14:19.325604658 +0000 UTC m=+8922.329389241" lastFinishedPulling="2026-02-23 09:14:19.780542964 +0000 UTC m=+8922.784327537" observedRunningTime="2026-02-23 09:14:20.400031997 +0000 UTC m=+8923.403816570" watchObservedRunningTime="2026-02-23 09:14:20.405220042 +0000 UTC m=+8923.409004635" Feb 23 09:14:21 crc kubenswrapper[5118]: I0223 09:14:21.388876 5118 generic.go:334] "Generic (PLEG): container finished" podID="350a056c-4003-4872-b099-2a475d6aabe9" containerID="97b68201467062490c6f0e86f1cc910649474143199f3dc5c6aec9997d79a334" exitCode=0 Feb 23 09:14:21 crc kubenswrapper[5118]: I0223 09:14:21.389381 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-8s5dc" 
event={"ID":"350a056c-4003-4872-b099-2a475d6aabe9","Type":"ContainerDied","Data":"97b68201467062490c6f0e86f1cc910649474143199f3dc5c6aec9997d79a334"} Feb 23 09:14:22 crc kubenswrapper[5118]: I0223 09:14:22.894648 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.018934 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker\") pod \"350a056c-4003-4872-b099-2a475d6aabe9\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.019209 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory\") pod \"350a056c-4003-4872-b099-2a475d6aabe9\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.019288 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdjkv\" (UniqueName: \"kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv\") pod \"350a056c-4003-4872-b099-2a475d6aabe9\" (UID: \"350a056c-4003-4872-b099-2a475d6aabe9\") " Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.028319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv" (OuterVolumeSpecName: "kube-api-access-gdjkv") pod "350a056c-4003-4872-b099-2a475d6aabe9" (UID: "350a056c-4003-4872-b099-2a475d6aabe9"). InnerVolumeSpecName "kube-api-access-gdjkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.047670 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory" (OuterVolumeSpecName: "inventory") pod "350a056c-4003-4872-b099-2a475d6aabe9" (UID: "350a056c-4003-4872-b099-2a475d6aabe9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.050063 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "350a056c-4003-4872-b099-2a475d6aabe9" (UID: "350a056c-4003-4872-b099-2a475d6aabe9"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.122029 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.122068 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdjkv\" (UniqueName: \"kubernetes.io/projected/350a056c-4003-4872-b099-2a475d6aabe9-kube-api-access-gdjkv\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.122081 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/350a056c-4003-4872-b099-2a475d6aabe9-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.407240 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-8s5dc" 
event={"ID":"350a056c-4003-4872-b099-2a475d6aabe9","Type":"ContainerDied","Data":"af58dd2a1d6169b2f7aa740b99f494ae1f080947c3abb12ae1b6ac667a5182db"} Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.407558 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af58dd2a1d6169b2f7aa740b99f494ae1f080947c3abb12ae1b6ac667a5182db" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.407672 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-8s5dc" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.557398 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-vxbgf"] Feb 23 09:14:23 crc kubenswrapper[5118]: E0223 09:14:23.557940 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="350a056c-4003-4872-b099-2a475d6aabe9" containerName="install-os-openstack-openstack-networker" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.557985 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="350a056c-4003-4872-b099-2a475d6aabe9" containerName="install-os-openstack-openstack-networker" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.560801 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="350a056c-4003-4872-b099-2a475d6aabe9" containerName="install-os-openstack-openstack-networker" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.562240 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.564909 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.566267 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.585411 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-vxbgf"] Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.654362 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqch\" (UniqueName: \"kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.654447 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.654473 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " 
pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.756040 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqch\" (UniqueName: \"kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.756155 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.756182 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.763185 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.764821 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: 
\"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.776014 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqch\" (UniqueName: \"kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch\") pod \"configure-os-openstack-openstack-networker-vxbgf\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:23 crc kubenswrapper[5118]: I0223 09:14:23.895089 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:14:24 crc kubenswrapper[5118]: I0223 09:14:24.468958 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-vxbgf"] Feb 23 09:14:25 crc kubenswrapper[5118]: I0223 09:14:25.426127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" event={"ID":"394e2de4-4a97-4449-9c8a-f6cd4e6a1788","Type":"ContainerStarted","Data":"4b6b46e172fd8420ef3dad441401c4a0496412d959d8b8147891dfbbfdf27640"} Feb 23 09:14:25 crc kubenswrapper[5118]: I0223 09:14:25.426443 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" event={"ID":"394e2de4-4a97-4449-9c8a-f6cd4e6a1788","Type":"ContainerStarted","Data":"28c23212674affde0fca8d42509c16207e2e743b3e31c38aef21a416d8b47c97"} Feb 23 09:14:25 crc kubenswrapper[5118]: I0223 09:14:25.449937 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" podStartSLOduration=2.053420607 
podStartE2EDuration="2.449921466s" podCreationTimestamp="2026-02-23 09:14:23 +0000 UTC" firstStartedPulling="2026-02-23 09:14:24.488576332 +0000 UTC m=+8927.492360905" lastFinishedPulling="2026-02-23 09:14:24.885077191 +0000 UTC m=+8927.888861764" observedRunningTime="2026-02-23 09:14:25.444382122 +0000 UTC m=+8928.448166735" watchObservedRunningTime="2026-02-23 09:14:25.449921466 +0000 UTC m=+8928.453706039" Feb 23 09:14:32 crc kubenswrapper[5118]: I0223 09:14:32.975092 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:14:32 crc kubenswrapper[5118]: I0223 09:14:32.976558 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:14:32 crc kubenswrapper[5118]: I0223 09:14:32.976692 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:14:32 crc kubenswrapper[5118]: I0223 09:14:32.977679 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:14:32 crc kubenswrapper[5118]: I0223 09:14:32.977822 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" gracePeriod=600 Feb 23 09:14:33 crc kubenswrapper[5118]: E0223 09:14:33.103593 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:14:33 crc kubenswrapper[5118]: I0223 09:14:33.517627 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" exitCode=0 Feb 23 09:14:33 crc kubenswrapper[5118]: I0223 09:14:33.517894 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"} Feb 23 09:14:33 crc kubenswrapper[5118]: I0223 09:14:33.517981 5118 scope.go:117] "RemoveContainer" containerID="3cdbaaea2ff4c518333e3cbc7d250f6e82fcea3d3f29154b7902186226929720" Feb 23 09:14:33 crc kubenswrapper[5118]: I0223 09:14:33.518462 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:14:33 crc kubenswrapper[5118]: E0223 09:14:33.518754 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:14:45 crc kubenswrapper[5118]: I0223 09:14:45.698183 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:14:45 crc kubenswrapper[5118]: E0223 09:14:45.701265 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:14:57 crc kubenswrapper[5118]: I0223 09:14:57.705155 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:14:57 crc kubenswrapper[5118]: E0223 09:14:57.706043 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.150398 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d"] Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.152827 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.155141 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.155396 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.167060 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d"] Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.231778 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkb25\" (UniqueName: \"kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.232430 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.232525 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.334737 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkb25\" (UniqueName: \"kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.334825 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.334844 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.335798 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.347676 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.358470 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkb25\" (UniqueName: \"kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25\") pod \"collect-profiles-29530635-wkf8d\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.491387 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:00 crc kubenswrapper[5118]: I0223 09:15:00.986129 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d"] Feb 23 09:15:01 crc kubenswrapper[5118]: I0223 09:15:01.866224 5118 generic.go:334] "Generic (PLEG): container finished" podID="10ed81de-2549-464b-9243-25e9cf51511c" containerID="a4b0589081c57b575016d17e7d1912dd918259c3dcb31a4c5b71ff4d12099014" exitCode=0 Feb 23 09:15:01 crc kubenswrapper[5118]: I0223 09:15:01.866324 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" event={"ID":"10ed81de-2549-464b-9243-25e9cf51511c","Type":"ContainerDied","Data":"a4b0589081c57b575016d17e7d1912dd918259c3dcb31a4c5b71ff4d12099014"} Feb 23 09:15:01 crc kubenswrapper[5118]: I0223 09:15:01.866629 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" 
event={"ID":"10ed81de-2549-464b-9243-25e9cf51511c","Type":"ContainerStarted","Data":"1e543f705d552332b4ffdd8ab57dc8460dd15ceaf06459d19b69dd3f2e97a684"} Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.245724 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.301259 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkb25\" (UniqueName: \"kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25\") pod \"10ed81de-2549-464b-9243-25e9cf51511c\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.301435 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume\") pod \"10ed81de-2549-464b-9243-25e9cf51511c\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.302181 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume" (OuterVolumeSpecName: "config-volume") pod "10ed81de-2549-464b-9243-25e9cf51511c" (UID: "10ed81de-2549-464b-9243-25e9cf51511c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.302335 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume\") pod \"10ed81de-2549-464b-9243-25e9cf51511c\" (UID: \"10ed81de-2549-464b-9243-25e9cf51511c\") " Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.303380 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10ed81de-2549-464b-9243-25e9cf51511c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.309206 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10ed81de-2549-464b-9243-25e9cf51511c" (UID: "10ed81de-2549-464b-9243-25e9cf51511c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.310413 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25" (OuterVolumeSpecName: "kube-api-access-kkb25") pod "10ed81de-2549-464b-9243-25e9cf51511c" (UID: "10ed81de-2549-464b-9243-25e9cf51511c"). InnerVolumeSpecName "kube-api-access-kkb25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.407809 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10ed81de-2549-464b-9243-25e9cf51511c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.407961 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkb25\" (UniqueName: \"kubernetes.io/projected/10ed81de-2549-464b-9243-25e9cf51511c-kube-api-access-kkb25\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.888789 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" event={"ID":"10ed81de-2549-464b-9243-25e9cf51511c","Type":"ContainerDied","Data":"1e543f705d552332b4ffdd8ab57dc8460dd15ceaf06459d19b69dd3f2e97a684"} Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.888828 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e543f705d552332b4ffdd8ab57dc8460dd15ceaf06459d19b69dd3f2e97a684" Feb 23 09:15:03 crc kubenswrapper[5118]: I0223 09:15:03.888841 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d" Feb 23 09:15:04 crc kubenswrapper[5118]: I0223 09:15:04.325502 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx"] Feb 23 09:15:04 crc kubenswrapper[5118]: I0223 09:15:04.334592 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530590-px8wx"] Feb 23 09:15:05 crc kubenswrapper[5118]: I0223 09:15:05.712282 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96aadd9-1e52-4dec-96cf-936f0d92aab1" path="/var/lib/kubelet/pods/b96aadd9-1e52-4dec-96cf-936f0d92aab1/volumes" Feb 23 09:15:07 crc kubenswrapper[5118]: I0223 09:15:07.003132 5118 scope.go:117] "RemoveContainer" containerID="519906fd70d32fc8df3ea8cfb0359c4725b533a5689c70686c36c318a7fa512f" Feb 23 09:15:07 crc kubenswrapper[5118]: I0223 09:15:07.935376 5118 generic.go:334] "Generic (PLEG): container finished" podID="2d868181-f742-41c8-b371-869982a76657" containerID="d316aa5dc492efb05b0a93b63672c95f963686347e751c5795fb382217ecbb46" exitCode=0 Feb 23 09:15:07 crc kubenswrapper[5118]: I0223 09:15:07.935505 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" event={"ID":"2d868181-f742-41c8-b371-869982a76657","Type":"ContainerDied","Data":"d316aa5dc492efb05b0a93b63672c95f963686347e751c5795fb382217ecbb46"} Feb 23 09:15:08 crc kubenswrapper[5118]: I0223 09:15:08.697386 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:15:08 crc kubenswrapper[5118]: E0223 09:15:08.698011 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.412771 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.435017 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1\") pod \"2d868181-f742-41c8-b371-869982a76657\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.435129 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory\") pod \"2d868181-f742-41c8-b371-869982a76657\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.435224 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn8w7\" (UniqueName: \"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7\") pod \"2d868181-f742-41c8-b371-869982a76657\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.435336 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph\") pod \"2d868181-f742-41c8-b371-869982a76657\" (UID: \"2d868181-f742-41c8-b371-869982a76657\") " Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.440432 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7" (OuterVolumeSpecName: "kube-api-access-pn8w7") pod "2d868181-f742-41c8-b371-869982a76657" (UID: "2d868181-f742-41c8-b371-869982a76657"). InnerVolumeSpecName "kube-api-access-pn8w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.440529 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph" (OuterVolumeSpecName: "ceph") pod "2d868181-f742-41c8-b371-869982a76657" (UID: "2d868181-f742-41c8-b371-869982a76657"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.469720 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory" (OuterVolumeSpecName: "inventory") pod "2d868181-f742-41c8-b371-869982a76657" (UID: "2d868181-f742-41c8-b371-869982a76657"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.474260 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "2d868181-f742-41c8-b371-869982a76657" (UID: "2d868181-f742-41c8-b371-869982a76657"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.537872 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.537902 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.537914 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d868181-f742-41c8-b371-869982a76657-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.537923 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn8w7\" (UniqueName: \"kubernetes.io/projected/2d868181-f742-41c8-b371-869982a76657-kube-api-access-pn8w7\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.957013 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" event={"ID":"2d868181-f742-41c8-b371-869982a76657","Type":"ContainerDied","Data":"907a125d21c5bf7b8bc9cea615bd7e3de10b02e9be43f250c87094af60493452"} Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.957053 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="907a125d21c5bf7b8bc9cea615bd7e3de10b02e9be43f250c87094af60493452" Feb 23 09:15:09 crc kubenswrapper[5118]: I0223 09:15:09.957120 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-rjjq4" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.060157 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-5wgqh"] Feb 23 09:15:10 crc kubenswrapper[5118]: E0223 09:15:10.060627 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d868181-f742-41c8-b371-869982a76657" containerName="install-os-openstack-openstack-cell1" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.060648 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d868181-f742-41c8-b371-869982a76657" containerName="install-os-openstack-openstack-cell1" Feb 23 09:15:10 crc kubenswrapper[5118]: E0223 09:15:10.060659 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ed81de-2549-464b-9243-25e9cf51511c" containerName="collect-profiles" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.060667 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ed81de-2549-464b-9243-25e9cf51511c" containerName="collect-profiles" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.060925 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="10ed81de-2549-464b-9243-25e9cf51511c" containerName="collect-profiles" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.060966 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d868181-f742-41c8-b371-869982a76657" containerName="install-os-openstack-openstack-cell1" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.061790 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.067493 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.067562 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.074007 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-5wgqh"] Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.150266 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nldg\" (UniqueName: \"kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.150473 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.150854 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.151261 5118 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.253449 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.253591 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.253716 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.253803 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nldg\" (UniqueName: \"kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: 
\"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.262142 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.266178 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.270906 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.271840 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nldg\" (UniqueName: \"kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg\") pod \"configure-os-openstack-openstack-cell1-5wgqh\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.378477 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.895813 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-5wgqh"] Feb 23 09:15:10 crc kubenswrapper[5118]: I0223 09:15:10.969678 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" event={"ID":"bfff4753-aa33-4692-8798-63c79f965334","Type":"ContainerStarted","Data":"20c839f4cd36b0a2b7281e50166eebff99dc39dccc094319ce412afd6e92691a"} Feb 23 09:15:11 crc kubenswrapper[5118]: I0223 09:15:11.987202 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" event={"ID":"bfff4753-aa33-4692-8798-63c79f965334","Type":"ContainerStarted","Data":"67ba97395254d0b7bab018ff764af49abf20505a55aba415e814344a63fd0923"} Feb 23 09:15:12 crc kubenswrapper[5118]: I0223 09:15:12.016794 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" podStartSLOduration=1.583447827 podStartE2EDuration="2.016776343s" podCreationTimestamp="2026-02-23 09:15:10 +0000 UTC" firstStartedPulling="2026-02-23 09:15:10.898882095 +0000 UTC m=+8973.902666668" lastFinishedPulling="2026-02-23 09:15:11.332210611 +0000 UTC m=+8974.335995184" observedRunningTime="2026-02-23 09:15:12.011943796 +0000 UTC m=+8975.015728379" watchObservedRunningTime="2026-02-23 09:15:12.016776343 +0000 UTC m=+8975.020560916" Feb 23 09:15:15 crc kubenswrapper[5118]: I0223 09:15:15.043780 5118 generic.go:334] "Generic (PLEG): container finished" podID="394e2de4-4a97-4449-9c8a-f6cd4e6a1788" containerID="4b6b46e172fd8420ef3dad441401c4a0496412d959d8b8147891dfbbfdf27640" exitCode=0 Feb 23 09:15:15 crc kubenswrapper[5118]: I0223 09:15:15.043839 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-openstack-openstack-networker-vxbgf" event={"ID":"394e2de4-4a97-4449-9c8a-f6cd4e6a1788","Type":"ContainerDied","Data":"4b6b46e172fd8420ef3dad441401c4a0496412d959d8b8147891dfbbfdf27640"} Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.620789 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.787288 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory\") pod \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.787763 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrqch\" (UniqueName: \"kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch\") pod \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.788667 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker\") pod \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\" (UID: \"394e2de4-4a97-4449-9c8a-f6cd4e6a1788\") " Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.797159 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch" (OuterVolumeSpecName: "kube-api-access-mrqch") pod "394e2de4-4a97-4449-9c8a-f6cd4e6a1788" (UID: "394e2de4-4a97-4449-9c8a-f6cd4e6a1788"). InnerVolumeSpecName "kube-api-access-mrqch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.823555 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "394e2de4-4a97-4449-9c8a-f6cd4e6a1788" (UID: "394e2de4-4a97-4449-9c8a-f6cd4e6a1788"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.823924 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory" (OuterVolumeSpecName: "inventory") pod "394e2de4-4a97-4449-9c8a-f6cd4e6a1788" (UID: "394e2de4-4a97-4449-9c8a-f6cd4e6a1788"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.892171 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.892205 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrqch\" (UniqueName: \"kubernetes.io/projected/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-kube-api-access-mrqch\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:16 crc kubenswrapper[5118]: I0223 09:15:16.892219 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/394e2de4-4a97-4449-9c8a-f6cd4e6a1788-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.068267 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" 
event={"ID":"394e2de4-4a97-4449-9c8a-f6cd4e6a1788","Type":"ContainerDied","Data":"28c23212674affde0fca8d42509c16207e2e743b3e31c38aef21a416d8b47c97"} Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.068317 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c23212674affde0fca8d42509c16207e2e743b3e31c38aef21a416d8b47c97" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.068380 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-vxbgf" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.154865 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-fkzmh"] Feb 23 09:15:17 crc kubenswrapper[5118]: E0223 09:15:17.155425 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394e2de4-4a97-4449-9c8a-f6cd4e6a1788" containerName="configure-os-openstack-openstack-networker" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.155463 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="394e2de4-4a97-4449-9c8a-f6cd4e6a1788" containerName="configure-os-openstack-openstack-networker" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.155783 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="394e2de4-4a97-4449-9c8a-f6cd4e6a1788" containerName="configure-os-openstack-openstack-networker" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.156837 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.159049 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.163547 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.170373 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-fkzmh"] Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.198062 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scj5z\" (UniqueName: \"kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.198220 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.198248 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: 
I0223 09:15:17.299476 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scj5z\" (UniqueName: \"kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.299601 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.299634 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.303504 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.307729 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: 
\"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.319649 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scj5z\" (UniqueName: \"kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z\") pod \"run-os-openstack-openstack-networker-fkzmh\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:17 crc kubenswrapper[5118]: I0223 09:15:17.478522 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:18 crc kubenswrapper[5118]: I0223 09:15:18.111260 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-fkzmh"] Feb 23 09:15:19 crc kubenswrapper[5118]: I0223 09:15:19.090583 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-fkzmh" event={"ID":"99646e7b-9e06-4397-8677-6fc383d0683d","Type":"ContainerStarted","Data":"0d49761baaef4b2a86dc2ddc79561808d2b6ce2f080ba0cd1c7f099eb2020349"} Feb 23 09:15:19 crc kubenswrapper[5118]: I0223 09:15:19.090925 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-fkzmh" event={"ID":"99646e7b-9e06-4397-8677-6fc383d0683d","Type":"ContainerStarted","Data":"f4dd19f05ae7f3342b0b60578ffeefba0c2f4ab289219b42da1ca2619a784ce3"} Feb 23 09:15:19 crc kubenswrapper[5118]: I0223 09:15:19.112993 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-fkzmh" podStartSLOduration=1.727740863 podStartE2EDuration="2.112970939s" podCreationTimestamp="2026-02-23 09:15:17 +0000 UTC" firstStartedPulling="2026-02-23 09:15:18.115829092 +0000 UTC m=+8981.119613675" lastFinishedPulling="2026-02-23 
09:15:18.501059178 +0000 UTC m=+8981.504843751" observedRunningTime="2026-02-23 09:15:19.107566228 +0000 UTC m=+8982.111350831" watchObservedRunningTime="2026-02-23 09:15:19.112970939 +0000 UTC m=+8982.116755512" Feb 23 09:15:23 crc kubenswrapper[5118]: I0223 09:15:23.698430 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:15:23 crc kubenswrapper[5118]: E0223 09:15:23.699785 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:27 crc kubenswrapper[5118]: I0223 09:15:27.171689 5118 generic.go:334] "Generic (PLEG): container finished" podID="99646e7b-9e06-4397-8677-6fc383d0683d" containerID="0d49761baaef4b2a86dc2ddc79561808d2b6ce2f080ba0cd1c7f099eb2020349" exitCode=0 Feb 23 09:15:27 crc kubenswrapper[5118]: I0223 09:15:27.171822 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-fkzmh" event={"ID":"99646e7b-9e06-4397-8677-6fc383d0683d","Type":"ContainerDied","Data":"0d49761baaef4b2a86dc2ddc79561808d2b6ce2f080ba0cd1c7f099eb2020349"} Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.655978 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.758135 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scj5z\" (UniqueName: \"kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z\") pod \"99646e7b-9e06-4397-8677-6fc383d0683d\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.758191 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory\") pod \"99646e7b-9e06-4397-8677-6fc383d0683d\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.758233 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker\") pod \"99646e7b-9e06-4397-8677-6fc383d0683d\" (UID: \"99646e7b-9e06-4397-8677-6fc383d0683d\") " Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.765529 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z" (OuterVolumeSpecName: "kube-api-access-scj5z") pod "99646e7b-9e06-4397-8677-6fc383d0683d" (UID: "99646e7b-9e06-4397-8677-6fc383d0683d"). InnerVolumeSpecName "kube-api-access-scj5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.785910 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "99646e7b-9e06-4397-8677-6fc383d0683d" (UID: "99646e7b-9e06-4397-8677-6fc383d0683d"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.798564 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory" (OuterVolumeSpecName: "inventory") pod "99646e7b-9e06-4397-8677-6fc383d0683d" (UID: "99646e7b-9e06-4397-8677-6fc383d0683d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.862671 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scj5z\" (UniqueName: \"kubernetes.io/projected/99646e7b-9e06-4397-8677-6fc383d0683d-kube-api-access-scj5z\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.862719 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:28 crc kubenswrapper[5118]: I0223 09:15:28.862733 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/99646e7b-9e06-4397-8677-6fc383d0683d-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.194315 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-fkzmh" 
event={"ID":"99646e7b-9e06-4397-8677-6fc383d0683d","Type":"ContainerDied","Data":"f4dd19f05ae7f3342b0b60578ffeefba0c2f4ab289219b42da1ca2619a784ce3"} Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.194390 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4dd19f05ae7f3342b0b60578ffeefba0c2f4ab289219b42da1ca2619a784ce3" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.194452 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-fkzmh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.271601 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-9m9lh"] Feb 23 09:15:29 crc kubenswrapper[5118]: E0223 09:15:29.272073 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99646e7b-9e06-4397-8677-6fc383d0683d" containerName="run-os-openstack-openstack-networker" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.272129 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="99646e7b-9e06-4397-8677-6fc383d0683d" containerName="run-os-openstack-openstack-networker" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.272416 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="99646e7b-9e06-4397-8677-6fc383d0683d" containerName="run-os-openstack-openstack-networker" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.273406 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.276566 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.276707 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.287982 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-9m9lh"] Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.373081 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.373253 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2m4r\" (UniqueName: \"kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.373296 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc 
kubenswrapper[5118]: I0223 09:15:29.474729 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2m4r\" (UniqueName: \"kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.475074 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.475293 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.483887 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.484048 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory\") pod 
\"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.501892 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2m4r\" (UniqueName: \"kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r\") pod \"reboot-os-openstack-openstack-networker-9m9lh\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:29 crc kubenswrapper[5118]: I0223 09:15:29.591657 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:30 crc kubenswrapper[5118]: I0223 09:15:30.147201 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-9m9lh"] Feb 23 09:15:30 crc kubenswrapper[5118]: W0223 09:15:30.155151 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11eb8834_97e4_4039_aab8_41e55d069d49.slice/crio-9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a WatchSource:0}: Error finding container 9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a: Status 404 returned error can't find the container with id 9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a Feb 23 09:15:30 crc kubenswrapper[5118]: I0223 09:15:30.206615 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" event={"ID":"11eb8834-97e4-4039-aab8-41e55d069d49","Type":"ContainerStarted","Data":"9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a"} Feb 23 09:15:31 crc kubenswrapper[5118]: I0223 09:15:31.219203 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" event={"ID":"11eb8834-97e4-4039-aab8-41e55d069d49","Type":"ContainerStarted","Data":"a2852864f6172251a0029eeca7feeefaacc48871615502f59babbba4972c9618"} Feb 23 09:15:31 crc kubenswrapper[5118]: I0223 09:15:31.238132 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" podStartSLOduration=1.767110774 podStartE2EDuration="2.238113158s" podCreationTimestamp="2026-02-23 09:15:29 +0000 UTC" firstStartedPulling="2026-02-23 09:15:30.159379704 +0000 UTC m=+8993.163164277" lastFinishedPulling="2026-02-23 09:15:30.630382058 +0000 UTC m=+8993.634166661" observedRunningTime="2026-02-23 09:15:31.237702608 +0000 UTC m=+8994.241487181" watchObservedRunningTime="2026-02-23 09:15:31.238113158 +0000 UTC m=+8994.241897731" Feb 23 09:15:35 crc kubenswrapper[5118]: I0223 09:15:35.697559 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:15:35 crc kubenswrapper[5118]: E0223 09:15:35.698414 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:45 crc kubenswrapper[5118]: I0223 09:15:45.361594 5118 generic.go:334] "Generic (PLEG): container finished" podID="11eb8834-97e4-4039-aab8-41e55d069d49" containerID="a2852864f6172251a0029eeca7feeefaacc48871615502f59babbba4972c9618" exitCode=0 Feb 23 09:15:45 crc kubenswrapper[5118]: I0223 09:15:45.362361 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" 
event={"ID":"11eb8834-97e4-4039-aab8-41e55d069d49","Type":"ContainerDied","Data":"a2852864f6172251a0029eeca7feeefaacc48871615502f59babbba4972c9618"} Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.034609 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.037848 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.061307 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.127565 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.127965 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92xs\" (UniqueName: \"kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.128151 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.231260 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.231674 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.231827 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92xs\" (UniqueName: \"kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.232035 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.232188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.261167 5118 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h92xs\" (UniqueName: \"kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs\") pod \"redhat-marketplace-b8rmk\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:46 crc kubenswrapper[5118]: I0223 09:15:46.360317 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.002583 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.012696 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.164481 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2m4r\" (UniqueName: \"kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r\") pod \"11eb8834-97e4-4039-aab8-41e55d069d49\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.164603 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker\") pod \"11eb8834-97e4-4039-aab8-41e55d069d49\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.164730 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory\") pod \"11eb8834-97e4-4039-aab8-41e55d069d49\" (UID: \"11eb8834-97e4-4039-aab8-41e55d069d49\") " Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 
09:15:47.171176 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r" (OuterVolumeSpecName: "kube-api-access-j2m4r") pod "11eb8834-97e4-4039-aab8-41e55d069d49" (UID: "11eb8834-97e4-4039-aab8-41e55d069d49"). InnerVolumeSpecName "kube-api-access-j2m4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.205560 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "11eb8834-97e4-4039-aab8-41e55d069d49" (UID: "11eb8834-97e4-4039-aab8-41e55d069d49"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.214765 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory" (OuterVolumeSpecName: "inventory") pod "11eb8834-97e4-4039-aab8-41e55d069d49" (UID: "11eb8834-97e4-4039-aab8-41e55d069d49"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.266942 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.266990 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2m4r\" (UniqueName: \"kubernetes.io/projected/11eb8834-97e4-4039-aab8-41e55d069d49-kube-api-access-j2m4r\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.267007 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/11eb8834-97e4-4039-aab8-41e55d069d49-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.393350 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" event={"ID":"11eb8834-97e4-4039-aab8-41e55d069d49","Type":"ContainerDied","Data":"9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a"} Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.393394 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b68ac484bd06ad5d9d74128e312bb4a2cf2aec394cbfe4469fb7189583bcd1a" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.393473 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-9m9lh" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.397534 5118 generic.go:334] "Generic (PLEG): container finished" podID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerID="9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853" exitCode=0 Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.397586 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerDied","Data":"9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853"} Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.397614 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerStarted","Data":"d1d4f0ae1e470df0c21e715004d0e11f31f675a641bf61ba3beb0580b579d108"} Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.481434 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-ntsq6"] Feb 23 09:15:47 crc kubenswrapper[5118]: E0223 09:15:47.482179 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11eb8834-97e4-4039-aab8-41e55d069d49" containerName="reboot-os-openstack-openstack-networker" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.482197 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="11eb8834-97e4-4039-aab8-41e55d069d49" containerName="reboot-os-openstack-openstack-networker" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.482391 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="11eb8834-97e4-4039-aab8-41e55d069d49" containerName="reboot-os-openstack-openstack-networker" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.483202 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.487730 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.487742 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.499048 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-ntsq6"] Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.679084 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.680013 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5884s\" (UniqueName: \"kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.680339 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " 
pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.680534 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.680625 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.680751 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783239 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783317 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783443 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783511 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5884s\" (UniqueName: \"kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783583 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.783631 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.788839 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.789024 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.789346 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.789589 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.795292 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:47 crc kubenswrapper[5118]: I0223 09:15:47.805937 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5884s\" (UniqueName: \"kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s\") pod \"install-certs-openstack-openstack-networker-ntsq6\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:48 crc kubenswrapper[5118]: I0223 09:15:48.099843 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:15:48 crc kubenswrapper[5118]: I0223 09:15:48.407719 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerStarted","Data":"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff"} Feb 23 09:15:48 crc kubenswrapper[5118]: I0223 09:15:48.625491 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-ntsq6"] Feb 23 09:15:48 crc kubenswrapper[5118]: I0223 09:15:48.709707 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:15:48 crc kubenswrapper[5118]: E0223 09:15:48.718204 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:49 crc kubenswrapper[5118]: I0223 09:15:49.418110 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" event={"ID":"327a5d6a-824c-45f9-b361-e796699fb933","Type":"ContainerStarted","Data":"d63f2ecb6f05cc6848b1586189a7e31224156a7bf1f412b8841525292f697926"} Feb 23 09:15:49 crc kubenswrapper[5118]: I0223 09:15:49.418366 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" event={"ID":"327a5d6a-824c-45f9-b361-e796699fb933","Type":"ContainerStarted","Data":"af70f837a8e7d179b020091b2433b050e44fcbee9e99e1eb78cc41b0684c01ca"} Feb 23 09:15:49 crc kubenswrapper[5118]: I0223 09:15:49.420301 5118 generic.go:334] "Generic (PLEG): container finished" podID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerID="02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff" exitCode=0 Feb 23 09:15:49 crc kubenswrapper[5118]: I0223 09:15:49.420341 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerDied","Data":"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff"} Feb 23 09:15:49 crc kubenswrapper[5118]: I0223 09:15:49.446881 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" podStartSLOduration=2.034868093 podStartE2EDuration="2.446854584s" podCreationTimestamp="2026-02-23 09:15:47 +0000 UTC" firstStartedPulling="2026-02-23 09:15:48.630451684 +0000 UTC m=+9011.634236257" lastFinishedPulling="2026-02-23 09:15:49.042438185 +0000 UTC m=+9012.046222748" observedRunningTime="2026-02-23 
09:15:49.433502081 +0000 UTC m=+9012.437286654" watchObservedRunningTime="2026-02-23 09:15:49.446854584 +0000 UTC m=+9012.450639157" Feb 23 09:15:50 crc kubenswrapper[5118]: I0223 09:15:50.437149 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerStarted","Data":"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c"} Feb 23 09:15:50 crc kubenswrapper[5118]: I0223 09:15:50.462553 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b8rmk" podStartSLOduration=2.015403018 podStartE2EDuration="4.462527196s" podCreationTimestamp="2026-02-23 09:15:46 +0000 UTC" firstStartedPulling="2026-02-23 09:15:47.401197853 +0000 UTC m=+9010.404982426" lastFinishedPulling="2026-02-23 09:15:49.848322041 +0000 UTC m=+9012.852106604" observedRunningTime="2026-02-23 09:15:50.454190685 +0000 UTC m=+9013.457975258" watchObservedRunningTime="2026-02-23 09:15:50.462527196 +0000 UTC m=+9013.466311769" Feb 23 09:15:56 crc kubenswrapper[5118]: I0223 09:15:56.360866 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:56 crc kubenswrapper[5118]: I0223 09:15:56.361756 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:56 crc kubenswrapper[5118]: I0223 09:15:56.421562 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:56 crc kubenswrapper[5118]: I0223 09:15:56.555250 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:56 crc kubenswrapper[5118]: I0223 09:15:56.664514 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:58 crc kubenswrapper[5118]: I0223 09:15:58.528388 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b8rmk" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="registry-server" containerID="cri-o://1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c" gracePeriod=2 Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.029715 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.127555 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content\") pod \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.127604 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities\") pod \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.127700 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92xs\" (UniqueName: \"kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs\") pod \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\" (UID: \"4aa3055e-bd31-4334-9c80-ac0a5ff624ec\") " Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.128634 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities" (OuterVolumeSpecName: "utilities") pod "4aa3055e-bd31-4334-9c80-ac0a5ff624ec" (UID: 
"4aa3055e-bd31-4334-9c80-ac0a5ff624ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.133627 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs" (OuterVolumeSpecName: "kube-api-access-h92xs") pod "4aa3055e-bd31-4334-9c80-ac0a5ff624ec" (UID: "4aa3055e-bd31-4334-9c80-ac0a5ff624ec"). InnerVolumeSpecName "kube-api-access-h92xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.152540 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa3055e-bd31-4334-9c80-ac0a5ff624ec" (UID: "4aa3055e-bd31-4334-9c80-ac0a5ff624ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.230425 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.230467 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.230485 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92xs\" (UniqueName: \"kubernetes.io/projected/4aa3055e-bd31-4334-9c80-ac0a5ff624ec-kube-api-access-h92xs\") on node \"crc\" DevicePath \"\"" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.542713 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="327a5d6a-824c-45f9-b361-e796699fb933" containerID="d63f2ecb6f05cc6848b1586189a7e31224156a7bf1f412b8841525292f697926" exitCode=0 Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.542766 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" event={"ID":"327a5d6a-824c-45f9-b361-e796699fb933","Type":"ContainerDied","Data":"d63f2ecb6f05cc6848b1586189a7e31224156a7bf1f412b8841525292f697926"} Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.546586 5118 generic.go:334] "Generic (PLEG): container finished" podID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerID="1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c" exitCode=0 Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.546620 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerDied","Data":"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c"} Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.546642 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b8rmk" event={"ID":"4aa3055e-bd31-4334-9c80-ac0a5ff624ec","Type":"ContainerDied","Data":"d1d4f0ae1e470df0c21e715004d0e11f31f675a641bf61ba3beb0580b579d108"} Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.546661 5118 scope.go:117] "RemoveContainer" containerID="1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.546673 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b8rmk" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.587177 5118 scope.go:117] "RemoveContainer" containerID="02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.601020 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.611894 5118 scope.go:117] "RemoveContainer" containerID="9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.612198 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b8rmk"] Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.664354 5118 scope.go:117] "RemoveContainer" containerID="1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c" Feb 23 09:15:59 crc kubenswrapper[5118]: E0223 09:15:59.667279 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c\": container with ID starting with 1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c not found: ID does not exist" containerID="1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.667333 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c"} err="failed to get container status \"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c\": rpc error: code = NotFound desc = could not find container \"1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c\": container with ID starting with 1fb6d8c7128551dcac8e33f8f2419b01e6ee629d3340dfdc643724a047f33f3c not found: 
ID does not exist" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.667361 5118 scope.go:117] "RemoveContainer" containerID="02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff" Feb 23 09:15:59 crc kubenswrapper[5118]: E0223 09:15:59.667641 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff\": container with ID starting with 02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff not found: ID does not exist" containerID="02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.667669 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff"} err="failed to get container status \"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff\": rpc error: code = NotFound desc = could not find container \"02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff\": container with ID starting with 02a8564b13bfae31e072a971fac8c375f714eced659eae7ba1bcc0f572ce75ff not found: ID does not exist" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.667682 5118 scope.go:117] "RemoveContainer" containerID="9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853" Feb 23 09:15:59 crc kubenswrapper[5118]: E0223 09:15:59.667981 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853\": container with ID starting with 9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853 not found: ID does not exist" containerID="9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.668031 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853"} err="failed to get container status \"9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853\": rpc error: code = NotFound desc = could not find container \"9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853\": container with ID starting with 9c8e9469080958bc68e7b76d04a35632691fc7c66bfed254cb0191cb730f4853 not found: ID does not exist" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.697785 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:15:59 crc kubenswrapper[5118]: E0223 09:15:59.698231 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:15:59 crc kubenswrapper[5118]: I0223 09:15:59.711443 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" path="/var/lib/kubelet/pods/4aa3055e-bd31-4334-9c80-ac0a5ff624ec/volumes" Feb 23 09:15:59 crc kubenswrapper[5118]: E0223 09:15:59.739486 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa3055e_bd31_4334_9c80_ac0a5ff624ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa3055e_bd31_4334_9c80_ac0a5ff624ec.slice/crio-d1d4f0ae1e470df0c21e715004d0e11f31f675a641bf61ba3beb0580b579d108\": RecentStats: unable to find data in memory cache]" Feb 23 
09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.113201 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.192711 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.192862 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5884s\" (UniqueName: \"kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.192922 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.193036 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.193082 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" 
(UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.193176 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle\") pod \"327a5d6a-824c-45f9-b361-e796699fb933\" (UID: \"327a5d6a-824c-45f9-b361-e796699fb933\") " Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.200009 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.200181 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.201121 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.201832 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s" (OuterVolumeSpecName: "kube-api-access-5884s") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "kube-api-access-5884s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.234729 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.236278 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory" (OuterVolumeSpecName: "inventory") pod "327a5d6a-824c-45f9-b361-e796699fb933" (UID: "327a5d6a-824c-45f9-b361-e796699fb933"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296277 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296312 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5884s\" (UniqueName: \"kubernetes.io/projected/327a5d6a-824c-45f9-b361-e796699fb933-kube-api-access-5884s\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296324 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296334 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296343 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.296354 5118 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327a5d6a-824c-45f9-b361-e796699fb933-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.597844 5118 generic.go:334] "Generic (PLEG): container finished" podID="bfff4753-aa33-4692-8798-63c79f965334" 
containerID="67ba97395254d0b7bab018ff764af49abf20505a55aba415e814344a63fd0923" exitCode=0 Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.598272 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" event={"ID":"bfff4753-aa33-4692-8798-63c79f965334","Type":"ContainerDied","Data":"67ba97395254d0b7bab018ff764af49abf20505a55aba415e814344a63fd0923"} Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.600788 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" event={"ID":"327a5d6a-824c-45f9-b361-e796699fb933","Type":"ContainerDied","Data":"af70f837a8e7d179b020091b2433b050e44fcbee9e99e1eb78cc41b0684c01ca"} Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.600831 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af70f837a8e7d179b020091b2433b050e44fcbee9e99e1eb78cc41b0684c01ca" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.600892 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-ntsq6" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749011 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-h8lzm"] Feb 23 09:16:01 crc kubenswrapper[5118]: E0223 09:16:01.749512 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327a5d6a-824c-45f9-b361-e796699fb933" containerName="install-certs-openstack-openstack-networker" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749539 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="327a5d6a-824c-45f9-b361-e796699fb933" containerName="install-certs-openstack-openstack-networker" Feb 23 09:16:01 crc kubenswrapper[5118]: E0223 09:16:01.749559 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="registry-server" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749567 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="registry-server" Feb 23 09:16:01 crc kubenswrapper[5118]: E0223 09:16:01.749577 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="extract-content" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749584 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="extract-content" Feb 23 09:16:01 crc kubenswrapper[5118]: E0223 09:16:01.749613 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="extract-utilities" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749621 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="extract-utilities" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749882 5118 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa3055e-bd31-4334-9c80-ac0a5ff624ec" containerName="registry-server" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.749917 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="327a5d6a-824c-45f9-b361-e796699fb933" containerName="install-certs-openstack-openstack-networker" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.750730 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.753252 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.756831 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.756837 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.785311 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-h8lzm"] Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.807990 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.808087 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztq2\" (UniqueName: \"kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2\") pod 
\"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.808138 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.808210 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.808316 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.909112 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.909189 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cztq2\" (UniqueName: \"kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.909221 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.909265 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.909331 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.910181 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " 
pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.915564 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.918545 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.924749 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:01 crc kubenswrapper[5118]: I0223 09:16:01.935374 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cztq2\" (UniqueName: \"kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2\") pod \"ovn-openstack-openstack-networker-h8lzm\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") " pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:02 crc kubenswrapper[5118]: I0223 09:16:02.070963 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-h8lzm" Feb 23 09:16:02 crc kubenswrapper[5118]: I0223 09:16:02.613135 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-h8lzm"] Feb 23 09:16:02 crc kubenswrapper[5118]: W0223 09:16:02.619639 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4258d556_fe9b_469a_b37a_6a15eb81b4be.slice/crio-08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9 WatchSource:0}: Error finding container 08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9: Status 404 returned error can't find the container with id 08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9 Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.082581 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.137255 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nldg\" (UniqueName: \"kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg\") pod \"bfff4753-aa33-4692-8798-63c79f965334\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.137486 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph\") pod \"bfff4753-aa33-4692-8798-63c79f965334\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.137540 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1\") pod 
\"bfff4753-aa33-4692-8798-63c79f965334\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.137579 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory\") pod \"bfff4753-aa33-4692-8798-63c79f965334\" (UID: \"bfff4753-aa33-4692-8798-63c79f965334\") " Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.150368 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg" (OuterVolumeSpecName: "kube-api-access-7nldg") pod "bfff4753-aa33-4692-8798-63c79f965334" (UID: "bfff4753-aa33-4692-8798-63c79f965334"). InnerVolumeSpecName "kube-api-access-7nldg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.153460 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph" (OuterVolumeSpecName: "ceph") pod "bfff4753-aa33-4692-8798-63c79f965334" (UID: "bfff4753-aa33-4692-8798-63c79f965334"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.187204 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bfff4753-aa33-4692-8798-63c79f965334" (UID: "bfff4753-aa33-4692-8798-63c79f965334"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.201406 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory" (OuterVolumeSpecName: "inventory") pod "bfff4753-aa33-4692-8798-63c79f965334" (UID: "bfff4753-aa33-4692-8798-63c79f965334"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.240150 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.240188 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.240201 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfff4753-aa33-4692-8798-63c79f965334-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.240280 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nldg\" (UniqueName: \"kubernetes.io/projected/bfff4753-aa33-4692-8798-63c79f965334-kube-api-access-7nldg\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.629553 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" event={"ID":"bfff4753-aa33-4692-8798-63c79f965334","Type":"ContainerDied","Data":"20c839f4cd36b0a2b7281e50166eebff99dc39dccc094319ce412afd6e92691a"} Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.629591 5118 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="20c839f4cd36b0a2b7281e50166eebff99dc39dccc094319ce412afd6e92691a" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.629666 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-5wgqh" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.640327 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-h8lzm" event={"ID":"4258d556-fe9b-469a-b37a-6a15eb81b4be","Type":"ContainerStarted","Data":"535a98dfb3d6357fc0b6d5800027eb0af3e6adb548fe2f921ab5f93032adaf37"} Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.640698 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-h8lzm" event={"ID":"4258d556-fe9b-469a-b37a-6a15eb81b4be","Type":"ContainerStarted","Data":"08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9"} Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.688478 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-h8lzm" podStartSLOduration=2.083992381 podStartE2EDuration="2.688452672s" podCreationTimestamp="2026-02-23 09:16:01 +0000 UTC" firstStartedPulling="2026-02-23 09:16:02.622563729 +0000 UTC m=+9025.626348302" lastFinishedPulling="2026-02-23 09:16:03.22702401 +0000 UTC m=+9026.230808593" observedRunningTime="2026-02-23 09:16:03.668612624 +0000 UTC m=+9026.672397197" watchObservedRunningTime="2026-02-23 09:16:03.688452672 +0000 UTC m=+9026.692237245" Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.742764 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-md2tf"] Feb 23 09:16:03 crc kubenswrapper[5118]: E0223 09:16:03.743306 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfff4753-aa33-4692-8798-63c79f965334" containerName="configure-os-openstack-openstack-cell1" Feb 23 09:16:03 crc 
kubenswrapper[5118]: I0223 09:16:03.743371 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfff4753-aa33-4692-8798-63c79f965334" containerName="configure-os-openstack-openstack-cell1"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.743673 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfff4753-aa33-4692-8798-63c79f965334" containerName="configure-os-openstack-openstack-cell1"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.744593 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.747343 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.747572 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755151 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755309 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755375 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755430 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755452 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvs8r\" (UniqueName: \"kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.755502 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.759530 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-md2tf"]
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.857910 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.858011 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.858076 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.858146 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvs8r\" (UniqueName: \"kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.858186 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.858251 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.863113 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.864686 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.865028 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.865123 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.866024 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:03 crc kubenswrapper[5118]: I0223 09:16:03.881048 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvs8r\" (UniqueName: \"kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r\") pod \"ssh-known-hosts-openstack-md2tf\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") " pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:04 crc kubenswrapper[5118]: I0223 09:16:04.069965 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:04 crc kubenswrapper[5118]: I0223 09:16:04.665674 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-md2tf"]
Feb 23 09:16:04 crc kubenswrapper[5118]: W0223 09:16:04.684358 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10127b7d_da9f_43f0_b2c4_625ea937f23d.slice/crio-dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c WatchSource:0}: Error finding container dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c: Status 404 returned error can't find the container with id dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c
Feb 23 09:16:05 crc kubenswrapper[5118]: I0223 09:16:05.666312 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-md2tf" event={"ID":"10127b7d-da9f-43f0-b2c4-625ea937f23d","Type":"ContainerStarted","Data":"94eaa2b3945296d54b9f0821fe37cae3e7457a32e7693e95333153598f18c483"}
Feb 23 09:16:05 crc kubenswrapper[5118]: I0223 09:16:05.666712 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-md2tf" event={"ID":"10127b7d-da9f-43f0-b2c4-625ea937f23d","Type":"ContainerStarted","Data":"dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c"}
Feb 23 09:16:05 crc kubenswrapper[5118]: I0223 09:16:05.690803 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-md2tf" podStartSLOduration=2.28881963 podStartE2EDuration="2.690770939s" podCreationTimestamp="2026-02-23 09:16:03 +0000 UTC" firstStartedPulling="2026-02-23 09:16:04.687523905 +0000 UTC m=+9027.691308478" lastFinishedPulling="2026-02-23 09:16:05.089475214 +0000 UTC m=+9028.093259787" observedRunningTime="2026-02-23 09:16:05.685078441 +0000 UTC m=+9028.688863014" watchObservedRunningTime="2026-02-23 09:16:05.690770939 +0000 UTC m=+9028.694555552"
Feb 23 09:16:10 crc kubenswrapper[5118]: I0223 09:16:10.697565 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:16:10 crc kubenswrapper[5118]: E0223 09:16:10.698131 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:16:19 crc kubenswrapper[5118]: I0223 09:16:19.800670 5118 generic.go:334] "Generic (PLEG): container finished" podID="10127b7d-da9f-43f0-b2c4-625ea937f23d" containerID="94eaa2b3945296d54b9f0821fe37cae3e7457a32e7693e95333153598f18c483" exitCode=0
Feb 23 09:16:19 crc kubenswrapper[5118]: I0223 09:16:19.800726 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-md2tf" event={"ID":"10127b7d-da9f-43f0-b2c4-625ea937f23d","Type":"ContainerDied","Data":"94eaa2b3945296d54b9f0821fe37cae3e7457a32e7693e95333153598f18c483"}
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.229149 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.354414 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.355758 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.356084 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.356260 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.356430 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.356587 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvs8r\" (UniqueName: \"kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.360222 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph" (OuterVolumeSpecName: "ceph") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.360326 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r" (OuterVolumeSpecName: "kube-api-access-vvs8r") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "kube-api-access-vvs8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.390921 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.397145 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.418267 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.458668 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.459604 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") pod \"10127b7d-da9f-43f0-b2c4-625ea937f23d\" (UID: \"10127b7d-da9f-43f0-b2c4-625ea937f23d\") "
Feb 23 09:16:21 crc kubenswrapper[5118]: W0223 09:16:21.459759 5118 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/10127b7d-da9f-43f0-b2c4-625ea937f23d/volumes/kubernetes.io~secret/inventory-0
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.459783 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "10127b7d-da9f-43f0-b2c4-625ea937f23d" (UID: "10127b7d-da9f-43f0-b2c4-625ea937f23d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460117 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460136 5118 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460145 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460154 5118 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460162 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvs8r\" (UniqueName: \"kubernetes.io/projected/10127b7d-da9f-43f0-b2c4-625ea937f23d-kube-api-access-vvs8r\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.460172 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/10127b7d-da9f-43f0-b2c4-625ea937f23d-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.820903 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-md2tf" event={"ID":"10127b7d-da9f-43f0-b2c4-625ea937f23d","Type":"ContainerDied","Data":"dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c"}
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.820966 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd2d9ec0355a18de4a0be3474e5994a2b42d4941fe2eb423acaca121ef50ea9c"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.821050 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-md2tf"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.933225 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-2rc7r"]
Feb 23 09:16:21 crc kubenswrapper[5118]: E0223 09:16:21.933875 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10127b7d-da9f-43f0-b2c4-625ea937f23d" containerName="ssh-known-hosts-openstack"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.933903 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="10127b7d-da9f-43f0-b2c4-625ea937f23d" containerName="ssh-known-hosts-openstack"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.934190 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="10127b7d-da9f-43f0-b2c4-625ea937f23d" containerName="ssh-known-hosts-openstack"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.935269 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.938050 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.938157 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:16:21 crc kubenswrapper[5118]: I0223 09:16:21.947975 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-2rc7r"]
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.071408 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.071493 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthgd\" (UniqueName: \"kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.071560 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.071768 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.184901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.184985 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wthgd\" (UniqueName: \"kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.185021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.185209 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.192134 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.192146 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.196650 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.207366 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthgd\" (UniqueName: \"kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd\") pod \"run-os-openstack-openstack-cell1-2rc7r\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") " pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.256773 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.781918 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-2rc7r"]
Feb 23 09:16:22 crc kubenswrapper[5118]: I0223 09:16:22.830286 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-2rc7r" event={"ID":"5fbe2adc-de72-4e17-8635-911860358d61","Type":"ContainerStarted","Data":"16280b2791c623c0daeecc666af9b56782b025049c71dc5d0a8c9b88e9c733db"}
Feb 23 09:16:23 crc kubenswrapper[5118]: I0223 09:16:23.841558 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-2rc7r" event={"ID":"5fbe2adc-de72-4e17-8635-911860358d61","Type":"ContainerStarted","Data":"c2c2c84955ac71aed6479a20b1954d96da7269c9a0b2648cbf083eda38eb9cc0"}
Feb 23 09:16:23 crc kubenswrapper[5118]: I0223 09:16:23.862195 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-2rc7r" podStartSLOduration=2.428395348 podStartE2EDuration="2.862174555s" podCreationTimestamp="2026-02-23 09:16:21 +0000 UTC" firstStartedPulling="2026-02-23 09:16:22.798448893 +0000 UTC m=+9045.802233466" lastFinishedPulling="2026-02-23 09:16:23.2322281 +0000 UTC m=+9046.236012673" observedRunningTime="2026-02-23 09:16:23.855048343 +0000 UTC m=+9046.858832916" watchObservedRunningTime="2026-02-23 09:16:23.862174555 +0000 UTC m=+9046.865959138"
Feb 23 09:16:24 crc kubenswrapper[5118]: I0223 09:16:24.697941 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:16:24 crc kubenswrapper[5118]: E0223 09:16:24.698472 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:16:32 crc kubenswrapper[5118]: I0223 09:16:32.928769 5118 generic.go:334] "Generic (PLEG): container finished" podID="5fbe2adc-de72-4e17-8635-911860358d61" containerID="c2c2c84955ac71aed6479a20b1954d96da7269c9a0b2648cbf083eda38eb9cc0" exitCode=0
Feb 23 09:16:32 crc kubenswrapper[5118]: I0223 09:16:32.929376 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-2rc7r" event={"ID":"5fbe2adc-de72-4e17-8635-911860358d61","Type":"ContainerDied","Data":"c2c2c84955ac71aed6479a20b1954d96da7269c9a0b2648cbf083eda38eb9cc0"}
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.475338 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.653016 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wthgd\" (UniqueName: \"kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd\") pod \"5fbe2adc-de72-4e17-8635-911860358d61\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") "
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.653086 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1\") pod \"5fbe2adc-de72-4e17-8635-911860358d61\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") "
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.653164 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph\") pod \"5fbe2adc-de72-4e17-8635-911860358d61\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") "
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.653350 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory\") pod \"5fbe2adc-de72-4e17-8635-911860358d61\" (UID: \"5fbe2adc-de72-4e17-8635-911860358d61\") "
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.659816 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd" (OuterVolumeSpecName: "kube-api-access-wthgd") pod "5fbe2adc-de72-4e17-8635-911860358d61" (UID: "5fbe2adc-de72-4e17-8635-911860358d61"). InnerVolumeSpecName "kube-api-access-wthgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.660203 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph" (OuterVolumeSpecName: "ceph") pod "5fbe2adc-de72-4e17-8635-911860358d61" (UID: "5fbe2adc-de72-4e17-8635-911860358d61"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.692801 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5fbe2adc-de72-4e17-8635-911860358d61" (UID: "5fbe2adc-de72-4e17-8635-911860358d61"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.695077 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory" (OuterVolumeSpecName: "inventory") pod "5fbe2adc-de72-4e17-8635-911860358d61" (UID: "5fbe2adc-de72-4e17-8635-911860358d61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.756238 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.756286 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wthgd\" (UniqueName: \"kubernetes.io/projected/5fbe2adc-de72-4e17-8635-911860358d61-kube-api-access-wthgd\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.756303 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.756316 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5fbe2adc-de72-4e17-8635-911860358d61-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.950739 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-2rc7r" event={"ID":"5fbe2adc-de72-4e17-8635-911860358d61","Type":"ContainerDied","Data":"16280b2791c623c0daeecc666af9b56782b025049c71dc5d0a8c9b88e9c733db"}
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.950799 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-2rc7r"
Feb 23 09:16:34 crc kubenswrapper[5118]: I0223 09:16:34.950821 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16280b2791c623c0daeecc666af9b56782b025049c71dc5d0a8c9b88e9c733db"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.035063 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-r5qpm"]
Feb 23 09:16:35 crc kubenswrapper[5118]: E0223 09:16:35.035852 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbe2adc-de72-4e17-8635-911860358d61" containerName="run-os-openstack-openstack-cell1"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.035902 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbe2adc-de72-4e17-8635-911860358d61" containerName="run-os-openstack-openstack-cell1"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.036246 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbe2adc-de72-4e17-8635-911860358d61" containerName="run-os-openstack-openstack-cell1"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.040252 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.046993 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.047000 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.068311 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-r5qpm"]
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.165307 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.165370 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.165523 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.165552 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrsmn\" (UniqueName: \"kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.267688 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.267859 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.267901 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrsmn\" (UniqueName: \"kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.268068 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.272286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.272446 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.279520 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.293516 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrsmn\" (UniqueName: \"kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn\") pod \"reboot-os-openstack-openstack-cell1-r5qpm\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm"
Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.368737 5118 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" Feb 23 09:16:35 crc kubenswrapper[5118]: I0223 09:16:35.964998 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-r5qpm"] Feb 23 09:16:36 crc kubenswrapper[5118]: I0223 09:16:36.976791 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" event={"ID":"5bb7c61d-20fe-4699-9ed2-1db80db282b5","Type":"ContainerStarted","Data":"366246d6e25a81efdfe827edd7e604cec07c5aa74c3b9dc2946f99ba04f6f267"} Feb 23 09:16:36 crc kubenswrapper[5118]: I0223 09:16:36.978561 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" event={"ID":"5bb7c61d-20fe-4699-9ed2-1db80db282b5","Type":"ContainerStarted","Data":"6fe7995950489c654a60e466dc4a58cbd2c5da0a6d0ed698155277e6ceea4970"} Feb 23 09:16:37 crc kubenswrapper[5118]: I0223 09:16:37.010468 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" podStartSLOduration=1.573834481 podStartE2EDuration="2.010438155s" podCreationTimestamp="2026-02-23 09:16:35 +0000 UTC" firstStartedPulling="2026-02-23 09:16:35.969963385 +0000 UTC m=+9058.973747958" lastFinishedPulling="2026-02-23 09:16:36.406567059 +0000 UTC m=+9059.410351632" observedRunningTime="2026-02-23 09:16:36.997995516 +0000 UTC m=+9060.001780089" watchObservedRunningTime="2026-02-23 09:16:37.010438155 +0000 UTC m=+9060.014222728" Feb 23 09:16:37 crc kubenswrapper[5118]: I0223 09:16:37.708915 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:16:37 crc kubenswrapper[5118]: E0223 09:16:37.709759 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:16:48 crc kubenswrapper[5118]: I0223 09:16:48.697268 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:16:48 crc kubenswrapper[5118]: E0223 09:16:48.698120 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:16:52 crc kubenswrapper[5118]: I0223 09:16:52.128450 5118 generic.go:334] "Generic (PLEG): container finished" podID="5bb7c61d-20fe-4699-9ed2-1db80db282b5" containerID="366246d6e25a81efdfe827edd7e604cec07c5aa74c3b9dc2946f99ba04f6f267" exitCode=0 Feb 23 09:16:52 crc kubenswrapper[5118]: I0223 09:16:52.128612 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" event={"ID":"5bb7c61d-20fe-4699-9ed2-1db80db282b5","Type":"ContainerDied","Data":"366246d6e25a81efdfe827edd7e604cec07c5aa74c3b9dc2946f99ba04f6f267"} Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.630162 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.691680 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrsmn\" (UniqueName: \"kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn\") pod \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.691743 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1\") pod \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.691807 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph\") pod \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.691835 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory\") pod \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\" (UID: \"5bb7c61d-20fe-4699-9ed2-1db80db282b5\") " Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.699353 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph" (OuterVolumeSpecName: "ceph") pod "5bb7c61d-20fe-4699-9ed2-1db80db282b5" (UID: "5bb7c61d-20fe-4699-9ed2-1db80db282b5"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.705935 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn" (OuterVolumeSpecName: "kube-api-access-qrsmn") pod "5bb7c61d-20fe-4699-9ed2-1db80db282b5" (UID: "5bb7c61d-20fe-4699-9ed2-1db80db282b5"). InnerVolumeSpecName "kube-api-access-qrsmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.726153 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5bb7c61d-20fe-4699-9ed2-1db80db282b5" (UID: "5bb7c61d-20fe-4699-9ed2-1db80db282b5"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.754889 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory" (OuterVolumeSpecName: "inventory") pod "5bb7c61d-20fe-4699-9ed2-1db80db282b5" (UID: "5bb7c61d-20fe-4699-9ed2-1db80db282b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.795428 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrsmn\" (UniqueName: \"kubernetes.io/projected/5bb7c61d-20fe-4699-9ed2-1db80db282b5-kube-api-access-qrsmn\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.795457 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.795467 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:53 crc kubenswrapper[5118]: I0223 09:16:53.795476 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bb7c61d-20fe-4699-9ed2-1db80db282b5-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.163237 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" event={"ID":"5bb7c61d-20fe-4699-9ed2-1db80db282b5","Type":"ContainerDied","Data":"6fe7995950489c654a60e466dc4a58cbd2c5da0a6d0ed698155277e6ceea4970"} Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.163269 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-r5qpm" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.163279 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe7995950489c654a60e466dc4a58cbd2c5da0a6d0ed698155277e6ceea4970" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.266735 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-jxfs9"] Feb 23 09:16:54 crc kubenswrapper[5118]: E0223 09:16:54.267298 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb7c61d-20fe-4699-9ed2-1db80db282b5" containerName="reboot-os-openstack-openstack-cell1" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.267314 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb7c61d-20fe-4699-9ed2-1db80db282b5" containerName="reboot-os-openstack-openstack-cell1" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.267518 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb7c61d-20fe-4699-9ed2-1db80db282b5" containerName="reboot-os-openstack-openstack-cell1" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.268286 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.271073 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.272406 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.278021 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-jxfs9"] Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.312905 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313019 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fv7x\" (UniqueName: \"kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313055 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " 
pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313116 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313162 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313255 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313392 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313449 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313495 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313602 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313774 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.313844 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418283 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418386 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418429 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418484 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fv7x\" (UniqueName: \"kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " 
pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418513 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418545 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418586 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418665 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418717 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418742 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418786 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.418855 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.427539 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " 
pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.427586 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.428034 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.432928 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.437877 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.439610 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory\") pod 
\"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.440710 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.441860 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.443874 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.450390 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.453788 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9"
Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.460813 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fv7x\" (UniqueName: \"kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x\") pod \"install-certs-openstack-openstack-cell1-jxfs9\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") " pod="openstack/install-certs-openstack-openstack-cell1-jxfs9"
Feb 23 09:16:54 crc kubenswrapper[5118]: I0223 09:16:54.582967 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9"
Feb 23 09:16:55 crc kubenswrapper[5118]: W0223 09:16:55.584033 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6383853_b182_4930_87fe_2d27f67c8cf3.slice/crio-32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108 WatchSource:0}: Error finding container 32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108: Status 404 returned error can't find the container with id 32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108
Feb 23 09:16:55 crc kubenswrapper[5118]: I0223 09:16:55.590874 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-jxfs9"]
Feb 23 09:16:56 crc kubenswrapper[5118]: I0223 09:16:56.213372 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" event={"ID":"a6383853-b182-4930-87fe-2d27f67c8cf3","Type":"ContainerStarted","Data":"32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108"}
Feb 23 09:16:57 crc kubenswrapper[5118]: I0223 09:16:57.224704 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" event={"ID":"a6383853-b182-4930-87fe-2d27f67c8cf3","Type":"ContainerStarted","Data":"a23ea128076e403ce7121fb2ff8a39d0427da4d0c371215500298a72b15b864a"}
Feb 23 09:16:57 crc kubenswrapper[5118]: I0223 09:16:57.259311 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" podStartSLOduration=2.855383664 podStartE2EDuration="3.259279759s" podCreationTimestamp="2026-02-23 09:16:54 +0000 UTC" firstStartedPulling="2026-02-23 09:16:55.586786624 +0000 UTC m=+9078.590571197" lastFinishedPulling="2026-02-23 09:16:55.990682709 +0000 UTC m=+9078.994467292" observedRunningTime="2026-02-23 09:16:57.248224853 +0000 UTC m=+9080.252009436" watchObservedRunningTime="2026-02-23 09:16:57.259279759 +0000 UTC m=+9080.263064342"
Feb 23 09:17:01 crc kubenswrapper[5118]: I0223 09:17:01.702467 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:17:01 crc kubenswrapper[5118]: E0223 09:17:01.703121 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:17:13 crc kubenswrapper[5118]: I0223 09:17:13.698054 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:17:13 crc kubenswrapper[5118]: E0223 09:17:13.698896 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:17:15 crc kubenswrapper[5118]: I0223 09:17:15.430081 5118 generic.go:334] "Generic (PLEG): container finished" podID="4258d556-fe9b-469a-b37a-6a15eb81b4be" containerID="535a98dfb3d6357fc0b6d5800027eb0af3e6adb548fe2f921ab5f93032adaf37" exitCode=0
Feb 23 09:17:15 crc kubenswrapper[5118]: I0223 09:17:15.430157 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-h8lzm" event={"ID":"4258d556-fe9b-469a-b37a-6a15eb81b4be","Type":"ContainerDied","Data":"535a98dfb3d6357fc0b6d5800027eb0af3e6adb548fe2f921ab5f93032adaf37"}
Feb 23 09:17:16 crc kubenswrapper[5118]: I0223 09:17:16.448170 5118 generic.go:334] "Generic (PLEG): container finished" podID="a6383853-b182-4930-87fe-2d27f67c8cf3" containerID="a23ea128076e403ce7121fb2ff8a39d0427da4d0c371215500298a72b15b864a" exitCode=0
Feb 23 09:17:16 crc kubenswrapper[5118]: I0223 09:17:16.448261 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" event={"ID":"a6383853-b182-4930-87fe-2d27f67c8cf3","Type":"ContainerDied","Data":"a23ea128076e403ce7121fb2ff8a39d0427da4d0c371215500298a72b15b864a"}
Feb 23 09:17:16 crc kubenswrapper[5118]: I0223 09:17:16.938066 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-h8lzm"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.014077 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0\") pod \"4258d556-fe9b-469a-b37a-6a15eb81b4be\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") "
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.014267 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle\") pod \"4258d556-fe9b-469a-b37a-6a15eb81b4be\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") "
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.014344 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory\") pod \"4258d556-fe9b-469a-b37a-6a15eb81b4be\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") "
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.014485 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cztq2\" (UniqueName: \"kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2\") pod \"4258d556-fe9b-469a-b37a-6a15eb81b4be\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") "
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.014535 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker\") pod \"4258d556-fe9b-469a-b37a-6a15eb81b4be\" (UID: \"4258d556-fe9b-469a-b37a-6a15eb81b4be\") "
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.020318 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4258d556-fe9b-469a-b37a-6a15eb81b4be" (UID: "4258d556-fe9b-469a-b37a-6a15eb81b4be"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.020371 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2" (OuterVolumeSpecName: "kube-api-access-cztq2") pod "4258d556-fe9b-469a-b37a-6a15eb81b4be" (UID: "4258d556-fe9b-469a-b37a-6a15eb81b4be"). InnerVolumeSpecName "kube-api-access-cztq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.039189 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4258d556-fe9b-469a-b37a-6a15eb81b4be" (UID: "4258d556-fe9b-469a-b37a-6a15eb81b4be"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.043421 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "4258d556-fe9b-469a-b37a-6a15eb81b4be" (UID: "4258d556-fe9b-469a-b37a-6a15eb81b4be"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.045929 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory" (OuterVolumeSpecName: "inventory") pod "4258d556-fe9b-469a-b37a-6a15eb81b4be" (UID: "4258d556-fe9b-469a-b37a-6a15eb81b4be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.117612 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.117640 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cztq2\" (UniqueName: \"kubernetes.io/projected/4258d556-fe9b-469a-b37a-6a15eb81b4be-kube-api-access-cztq2\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.117650 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.117659 5118 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.117667 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4258d556-fe9b-469a-b37a-6a15eb81b4be-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.471897 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-h8lzm" event={"ID":"4258d556-fe9b-469a-b37a-6a15eb81b4be","Type":"ContainerDied","Data":"08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9"}
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.474014 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d3d373faa44c420cb762b4e66b1a99d58c4e44db6f9d38da1e9e2c0d7f31a9"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.472010 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-h8lzm"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.566344 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-m96tb"]
Feb 23 09:17:17 crc kubenswrapper[5118]: E0223 09:17:17.566839 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4258d556-fe9b-469a-b37a-6a15eb81b4be" containerName="ovn-openstack-openstack-networker"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.566859 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="4258d556-fe9b-469a-b37a-6a15eb81b4be" containerName="ovn-openstack-openstack-networker"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.567082 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="4258d556-fe9b-469a-b37a-6a15eb81b4be" containerName="ovn-openstack-openstack-networker"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.567761 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.569350 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.570741 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.571201 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-c5q2t"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.574397 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.575246 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-m96tb"]
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.626777 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.626831 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.626858 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.626959 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.627300 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv2b8\" (UniqueName: \"kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.627421 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.730824 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.730884 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.730913 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.730945 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.731019 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv2b8\" (UniqueName: \"kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.731061 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.736312 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.740705 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.741725 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.742405 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.745290 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.758157 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv2b8\" (UniqueName: \"kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8\") pod \"neutron-metadata-openstack-openstack-networker-m96tb\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:17 crc kubenswrapper[5118]: I0223 09:17:17.898465 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.014359 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139639 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139795 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139827 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139863 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139900 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139943 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.139982 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.140015 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.140846 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.140916 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fv7x\" (UniqueName: \"kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.140963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.140987 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle\") pod \"a6383853-b182-4930-87fe-2d27f67c8cf3\" (UID: \"a6383853-b182-4930-87fe-2d27f67c8cf3\") "
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.145744 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.146462 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph" (OuterVolumeSpecName: "ceph") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.146504 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.146832 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.146986 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.147626 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.148087 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.148676 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.148689 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x" (OuterVolumeSpecName: "kube-api-access-5fv7x") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "kube-api-access-5fv7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.162759 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.175857 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.179955 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory" (OuterVolumeSpecName: "inventory") pod "a6383853-b182-4930-87fe-2d27f67c8cf3" (UID: "a6383853-b182-4930-87fe-2d27f67c8cf3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244380 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244414 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244425 5118 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244437 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244447 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fv7x\" (UniqueName: \"kubernetes.io/projected/a6383853-b182-4930-87fe-2d27f67c8cf3-kube-api-access-5fv7x\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244457 5118 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244466 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244476 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244489 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244537 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244552 5118 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.244563 5118 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6383853-b182-4930-87fe-2d27f67c8cf3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.438809 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-m96tb"]
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.442295 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.482609 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9" event={"ID":"a6383853-b182-4930-87fe-2d27f67c8cf3","Type":"ContainerDied","Data":"32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108"}
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.482648 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-jxfs9"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.482657 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32067e9d6b6b9f078d73b2b666dd0a7b66e859685b0c8cd20fde2d97a9bae108"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.484505 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" event={"ID":"d0897d57-2e00-4c9a-be28-20a997e5b96f","Type":"ContainerStarted","Data":"af4e165f5d8e5fc708a69d656b1eaa6803348938194d35d994a7bff23d71a6fc"}
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.561692 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-qtj48"]
Feb 23 09:17:18 crc kubenswrapper[5118]: E0223 09:17:18.562133 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6383853-b182-4930-87fe-2d27f67c8cf3" containerName="install-certs-openstack-openstack-cell1"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.562151 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6383853-b182-4930-87fe-2d27f67c8cf3" containerName="install-certs-openstack-openstack-cell1"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.562351 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6383853-b182-4930-87fe-2d27f67c8cf3" containerName="install-certs-openstack-openstack-cell1"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.563016 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.565077 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.565256 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.584934 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-qtj48"]
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.652610 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.652657 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfz5l\" (UniqueName: \"kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48"
Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.652692 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.652732 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.754597 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.754647 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfz5l\" (UniqueName: \"kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.754673 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc 
kubenswrapper[5118]: I0223 09:17:18.754707 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.760462 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.760740 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.761743 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.773690 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfz5l\" (UniqueName: \"kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l\") pod \"ceph-client-openstack-openstack-cell1-qtj48\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:18 crc kubenswrapper[5118]: I0223 09:17:18.886730 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:19 crc kubenswrapper[5118]: I0223 09:17:19.431469 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-qtj48"] Feb 23 09:17:19 crc kubenswrapper[5118]: I0223 09:17:19.517915 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" event={"ID":"d0897d57-2e00-4c9a-be28-20a997e5b96f","Type":"ContainerStarted","Data":"bbe08366f66a916d01685fa2136dde43400627fce96aceb811eceb4b98d5c7a2"} Feb 23 09:17:19 crc kubenswrapper[5118]: I0223 09:17:19.520823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" event={"ID":"afb0350b-ca38-4bda-82cb-78dd02b3eef1","Type":"ContainerStarted","Data":"95a7afec8be22f620c57d6e9dee110665e0b9b40081960ea7d91dfabbbc61804"} Feb 23 09:17:19 crc kubenswrapper[5118]: I0223 09:17:19.533277 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" podStartSLOduration=2.029224608 podStartE2EDuration="2.533260548s" podCreationTimestamp="2026-02-23 09:17:17 +0000 UTC" firstStartedPulling="2026-02-23 09:17:18.442111576 +0000 UTC m=+9101.445896149" lastFinishedPulling="2026-02-23 09:17:18.946147476 +0000 UTC m=+9101.949932089" observedRunningTime="2026-02-23 09:17:19.532489429 +0000 UTC m=+9102.536274002" watchObservedRunningTime="2026-02-23 09:17:19.533260548 +0000 UTC m=+9102.537045121" Feb 23 09:17:20 crc kubenswrapper[5118]: I0223 09:17:20.534800 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" 
event={"ID":"afb0350b-ca38-4bda-82cb-78dd02b3eef1","Type":"ContainerStarted","Data":"b02721c60cb2a25a198c58b83613cc565a4da557064c283362790da1e5d05b60"} Feb 23 09:17:20 crc kubenswrapper[5118]: I0223 09:17:20.561765 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" podStartSLOduration=2.137957886 podStartE2EDuration="2.56173907s" podCreationTimestamp="2026-02-23 09:17:18 +0000 UTC" firstStartedPulling="2026-02-23 09:17:19.436484216 +0000 UTC m=+9102.440268789" lastFinishedPulling="2026-02-23 09:17:19.86026539 +0000 UTC m=+9102.864049973" observedRunningTime="2026-02-23 09:17:20.560173832 +0000 UTC m=+9103.563958415" watchObservedRunningTime="2026-02-23 09:17:20.56173907 +0000 UTC m=+9103.565523653" Feb 23 09:17:25 crc kubenswrapper[5118]: I0223 09:17:25.584025 5118 generic.go:334] "Generic (PLEG): container finished" podID="afb0350b-ca38-4bda-82cb-78dd02b3eef1" containerID="b02721c60cb2a25a198c58b83613cc565a4da557064c283362790da1e5d05b60" exitCode=0 Feb 23 09:17:25 crc kubenswrapper[5118]: I0223 09:17:25.584060 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" event={"ID":"afb0350b-ca38-4bda-82cb-78dd02b3eef1","Type":"ContainerDied","Data":"b02721c60cb2a25a198c58b83613cc565a4da557064c283362790da1e5d05b60"} Feb 23 09:17:26 crc kubenswrapper[5118]: I0223 09:17:26.698315 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:17:26 crc kubenswrapper[5118]: E0223 09:17:26.698946 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.092992 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.252221 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph\") pod \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.252344 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory\") pod \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.252616 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1\") pod \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.252661 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfz5l\" (UniqueName: \"kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l\") pod \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\" (UID: \"afb0350b-ca38-4bda-82cb-78dd02b3eef1\") " Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.263374 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph" (OuterVolumeSpecName: "ceph") pod "afb0350b-ca38-4bda-82cb-78dd02b3eef1" (UID: "afb0350b-ca38-4bda-82cb-78dd02b3eef1"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.269452 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l" (OuterVolumeSpecName: "kube-api-access-pfz5l") pod "afb0350b-ca38-4bda-82cb-78dd02b3eef1" (UID: "afb0350b-ca38-4bda-82cb-78dd02b3eef1"). InnerVolumeSpecName "kube-api-access-pfz5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.283525 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "afb0350b-ca38-4bda-82cb-78dd02b3eef1" (UID: "afb0350b-ca38-4bda-82cb-78dd02b3eef1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.292292 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory" (OuterVolumeSpecName: "inventory") pod "afb0350b-ca38-4bda-82cb-78dd02b3eef1" (UID: "afb0350b-ca38-4bda-82cb-78dd02b3eef1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.356608 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.356652 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfz5l\" (UniqueName: \"kubernetes.io/projected/afb0350b-ca38-4bda-82cb-78dd02b3eef1-kube-api-access-pfz5l\") on node \"crc\" DevicePath \"\"" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.356665 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.356683 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afb0350b-ca38-4bda-82cb-78dd02b3eef1-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.604171 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.604165 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-qtj48" event={"ID":"afb0350b-ca38-4bda-82cb-78dd02b3eef1","Type":"ContainerDied","Data":"95a7afec8be22f620c57d6e9dee110665e0b9b40081960ea7d91dfabbbc61804"} Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.604336 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95a7afec8be22f620c57d6e9dee110665e0b9b40081960ea7d91dfabbbc61804" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.691303 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9zb28"] Feb 23 09:17:27 crc kubenswrapper[5118]: E0223 09:17:27.691766 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb0350b-ca38-4bda-82cb-78dd02b3eef1" containerName="ceph-client-openstack-openstack-cell1" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.691781 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb0350b-ca38-4bda-82cb-78dd02b3eef1" containerName="ceph-client-openstack-openstack-cell1" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.691979 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb0350b-ca38-4bda-82cb-78dd02b3eef1" containerName="ceph-client-openstack-openstack-cell1" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.692712 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.694658 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.694976 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.697571 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.735780 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9zb28"] Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.868838 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.868906 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.868935 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmmn\" (UniqueName: \"kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: 
\"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.868967 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.869701 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.869866 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972275 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972371 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972486 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972532 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972598 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hmmn\" (UniqueName: \"kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.972739 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.973936 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.977298 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.977344 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.977810 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:27 crc kubenswrapper[5118]: I0223 09:17:27.978585 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:28 crc kubenswrapper[5118]: I0223 09:17:28.015134 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4hmmn\" (UniqueName: \"kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn\") pod \"ovn-openstack-openstack-cell1-9zb28\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") " pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:28 crc kubenswrapper[5118]: I0223 09:17:28.033727 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9zb28" Feb 23 09:17:28 crc kubenswrapper[5118]: I0223 09:17:28.601546 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-9zb28"] Feb 23 09:17:28 crc kubenswrapper[5118]: I0223 09:17:28.614434 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9zb28" event={"ID":"15546255-9c2d-4475-93b1-1805b835245f","Type":"ContainerStarted","Data":"1400b1c337a787b63b2c02095e3b9af75163ee972bd0ebaea36c66d02d7030d0"} Feb 23 09:17:29 crc kubenswrapper[5118]: I0223 09:17:29.626608 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9zb28" event={"ID":"15546255-9c2d-4475-93b1-1805b835245f","Type":"ContainerStarted","Data":"1b793c3ed0154fcb8641aec2d4f01b4692fa83225fef3251d641bb72928430a6"} Feb 23 09:17:29 crc kubenswrapper[5118]: I0223 09:17:29.646244 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-9zb28" podStartSLOduration=2.177106565 podStartE2EDuration="2.646222133s" podCreationTimestamp="2026-02-23 09:17:27 +0000 UTC" firstStartedPulling="2026-02-23 09:17:28.589818138 +0000 UTC m=+9111.593602711" lastFinishedPulling="2026-02-23 09:17:29.058933706 +0000 UTC m=+9112.062718279" observedRunningTime="2026-02-23 09:17:29.644534192 +0000 UTC m=+9112.648318765" watchObservedRunningTime="2026-02-23 09:17:29.646222133 +0000 UTC m=+9112.650006716" Feb 23 09:17:39 crc kubenswrapper[5118]: I0223 09:17:39.698280 5118 scope.go:117] 
"RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:17:39 crc kubenswrapper[5118]: E0223 09:17:39.699060 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:17:52 crc kubenswrapper[5118]: I0223 09:17:52.696768 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:17:52 crc kubenswrapper[5118]: E0223 09:17:52.697662 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:18:03 crc kubenswrapper[5118]: I0223 09:18:03.698349 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:18:03 crc kubenswrapper[5118]: E0223 09:18:03.699213 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:18:14 crc kubenswrapper[5118]: I0223 09:18:14.697383 
5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:18:14 crc kubenswrapper[5118]: E0223 09:18:14.698460 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:18:18 crc kubenswrapper[5118]: I0223 09:18:18.110987 5118 generic.go:334] "Generic (PLEG): container finished" podID="d0897d57-2e00-4c9a-be28-20a997e5b96f" containerID="bbe08366f66a916d01685fa2136dde43400627fce96aceb811eceb4b98d5c7a2" exitCode=0 Feb 23 09:18:18 crc kubenswrapper[5118]: I0223 09:18:18.111125 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" event={"ID":"d0897d57-2e00-4c9a-be28-20a997e5b96f","Type":"ContainerDied","Data":"bbe08366f66a916d01685fa2136dde43400627fce96aceb811eceb4b98d5c7a2"} Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.663429 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786574 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786649 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786710 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786909 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786926 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") " 
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.786963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv2b8\" (UniqueName: \"kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8\") pod \"d0897d57-2e00-4c9a-be28-20a997e5b96f\" (UID: \"d0897d57-2e00-4c9a-be28-20a997e5b96f\") "
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.792325 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8" (OuterVolumeSpecName: "kube-api-access-tv2b8") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "kube-api-access-tv2b8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.813251 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.828215 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.828316 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.831229 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory" (OuterVolumeSpecName: "inventory") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.831387 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "d0897d57-2e00-4c9a-be28-20a997e5b96f" (UID: "d0897d57-2e00-4c9a-be28-20a997e5b96f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889561 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889596 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889609 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv2b8\" (UniqueName: \"kubernetes.io/projected/d0897d57-2e00-4c9a-be28-20a997e5b96f-kube-api-access-tv2b8\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889618 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889630 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:19 crc kubenswrapper[5118]: I0223 09:18:19.889641 5118 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/d0897d57-2e00-4c9a-be28-20a997e5b96f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:20 crc kubenswrapper[5118]: I0223 09:18:20.136026 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb" event={"ID":"d0897d57-2e00-4c9a-be28-20a997e5b96f","Type":"ContainerDied","Data":"af4e165f5d8e5fc708a69d656b1eaa6803348938194d35d994a7bff23d71a6fc"}
Feb 23 09:18:20 crc kubenswrapper[5118]: I0223 09:18:20.136088 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-m96tb"
Feb 23 09:18:20 crc kubenswrapper[5118]: I0223 09:18:20.136125 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af4e165f5d8e5fc708a69d656b1eaa6803348938194d35d994a7bff23d71a6fc"
Feb 23 09:18:28 crc kubenswrapper[5118]: I0223 09:18:28.698067 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:18:28 crc kubenswrapper[5118]: E0223 09:18:28.698895 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:18:41 crc kubenswrapper[5118]: I0223 09:18:41.697412 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:18:41 crc kubenswrapper[5118]: E0223 09:18:41.698076 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:18:47 crc kubenswrapper[5118]: I0223 09:18:47.465210 5118 generic.go:334] "Generic (PLEG): container finished" podID="15546255-9c2d-4475-93b1-1805b835245f" containerID="1b793c3ed0154fcb8641aec2d4f01b4692fa83225fef3251d641bb72928430a6" exitCode=0
Feb 23 09:18:47 crc kubenswrapper[5118]: I0223 09:18:47.465282 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9zb28" event={"ID":"15546255-9c2d-4475-93b1-1805b835245f","Type":"ContainerDied","Data":"1b793c3ed0154fcb8641aec2d4f01b4692fa83225fef3251d641bb72928430a6"}
Feb 23 09:18:48 crc kubenswrapper[5118]: I0223 09:18:48.937140 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9zb28"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.110084 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.110815 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.111017 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.111218 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hmmn\" (UniqueName: \"kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.111356 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.111588 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph\") pod \"15546255-9c2d-4475-93b1-1805b835245f\" (UID: \"15546255-9c2d-4475-93b1-1805b835245f\") "
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.117728 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.118263 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph" (OuterVolumeSpecName: "ceph") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.119861 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn" (OuterVolumeSpecName: "kube-api-access-4hmmn") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "kube-api-access-4hmmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.145793 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory" (OuterVolumeSpecName: "inventory") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.149293 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.167945 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "15546255-9c2d-4475-93b1-1805b835245f" (UID: "15546255-9c2d-4475-93b1-1805b835245f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215355 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215747 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215763 5118 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/15546255-9c2d-4475-93b1-1805b835245f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215775 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hmmn\" (UniqueName: \"kubernetes.io/projected/15546255-9c2d-4475-93b1-1805b835245f-kube-api-access-4hmmn\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215789 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.215801 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/15546255-9c2d-4475-93b1-1805b835245f-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.489803 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-9zb28" event={"ID":"15546255-9c2d-4475-93b1-1805b835245f","Type":"ContainerDied","Data":"1400b1c337a787b63b2c02095e3b9af75163ee972bd0ebaea36c66d02d7030d0"}
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.489848 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1400b1c337a787b63b2c02095e3b9af75163ee972bd0ebaea36c66d02d7030d0"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.489908 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-9zb28"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.600393 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-chnnl"]
Feb 23 09:18:49 crc kubenswrapper[5118]: E0223 09:18:49.600876 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0897d57-2e00-4c9a-be28-20a997e5b96f" containerName="neutron-metadata-openstack-openstack-networker"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.600904 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0897d57-2e00-4c9a-be28-20a997e5b96f" containerName="neutron-metadata-openstack-openstack-networker"
Feb 23 09:18:49 crc kubenswrapper[5118]: E0223 09:18:49.600938 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15546255-9c2d-4475-93b1-1805b835245f" containerName="ovn-openstack-openstack-cell1"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.600947 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="15546255-9c2d-4475-93b1-1805b835245f" containerName="ovn-openstack-openstack-cell1"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.601150 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0897d57-2e00-4c9a-be28-20a997e5b96f" containerName="neutron-metadata-openstack-openstack-networker"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.601185 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="15546255-9c2d-4475-93b1-1805b835245f" containerName="ovn-openstack-openstack-cell1"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.601919 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.605416 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.605452 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.605499 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.606102 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.606272 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.606277 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.637689 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-chnnl"]
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729088 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729163 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729236 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7pp\" (UniqueName: \"kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729520 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729642 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.729731 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.831833 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.831909 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.831944 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.832021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.832632 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.832668 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.832764 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7pp\" (UniqueName: \"kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.836219 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.836300 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.836315 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.836777 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.837293 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.837400 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.856190 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7pp\" (UniqueName: \"kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp\") pod \"neutron-metadata-openstack-openstack-cell1-chnnl\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:49 crc kubenswrapper[5118]: I0223 09:18:49.941585 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:18:50 crc kubenswrapper[5118]: I0223 09:18:50.480822 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-chnnl"]
Feb 23 09:18:50 crc kubenswrapper[5118]: I0223 09:18:50.502694 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" event={"ID":"837cd892-bf64-4ba8-9805-290c7fb37f26","Type":"ContainerStarted","Data":"d5a98f1af22185ec68753dde0418a9077b6c51fe8d74d20dbd493bec60ed9c80"}
Feb 23 09:18:51 crc kubenswrapper[5118]: I0223 09:18:51.513561 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" event={"ID":"837cd892-bf64-4ba8-9805-290c7fb37f26","Type":"ContainerStarted","Data":"de50ab2e30de01a0ef5975c3f12940f45ac6920b71703c08edd8a32c8031883a"}
Feb 23 09:18:51 crc kubenswrapper[5118]: I0223 09:18:51.544472 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" podStartSLOduration=2.119279596 podStartE2EDuration="2.544449627s" podCreationTimestamp="2026-02-23 09:18:49 +0000 UTC" firstStartedPulling="2026-02-23 09:18:50.481247478 +0000 UTC m=+9193.485032051" lastFinishedPulling="2026-02-23 09:18:50.906417509 +0000 UTC m=+9193.910202082" observedRunningTime="2026-02-23 09:18:51.536950275 +0000 UTC m=+9194.540734848" watchObservedRunningTime="2026-02-23 09:18:51.544449627 +0000 UTC m=+9194.548234200"
Feb 23 09:18:53 crc kubenswrapper[5118]: I0223 09:18:53.697849 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:18:53 crc kubenswrapper[5118]: E0223 09:18:53.698606 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:19:07 crc kubenswrapper[5118]: I0223 09:19:07.709546 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:19:07 crc kubenswrapper[5118]: E0223 09:19:07.710940 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:19:21 crc kubenswrapper[5118]: I0223 09:19:21.697621 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:19:21 crc kubenswrapper[5118]: E0223 09:19:21.700475 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:19:32 crc kubenswrapper[5118]: I0223 09:19:32.697390 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:19:32 crc kubenswrapper[5118]: E0223 09:19:32.698075 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:19:43 crc kubenswrapper[5118]: I0223 09:19:43.697876 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae"
Feb 23 09:19:44 crc kubenswrapper[5118]: I0223 09:19:44.135949 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635"}
Feb 23 09:19:52 crc kubenswrapper[5118]: I0223 09:19:52.226511 5118 generic.go:334] "Generic (PLEG): container finished" podID="837cd892-bf64-4ba8-9805-290c7fb37f26" containerID="de50ab2e30de01a0ef5975c3f12940f45ac6920b71703c08edd8a32c8031883a" exitCode=0
Feb 23 09:19:52 crc kubenswrapper[5118]: I0223 09:19:52.226587 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" event={"ID":"837cd892-bf64-4ba8-9805-290c7fb37f26","Type":"ContainerDied","Data":"de50ab2e30de01a0ef5975c3f12940f45ac6920b71703c08edd8a32c8031883a"}
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.733897 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl"
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.852594 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.852990 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.853072 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.853190 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.853291 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.853433 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.853668 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx7pp\" (UniqueName: \"kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp\") pod \"837cd892-bf64-4ba8-9805-290c7fb37f26\" (UID: \"837cd892-bf64-4ba8-9805-290c7fb37f26\") "
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.859763 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.859762 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp" (OuterVolumeSpecName: "kube-api-access-rx7pp") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "kube-api-access-rx7pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.860388 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph" (OuterVolumeSpecName: "ceph") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26").
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.888520 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.889210 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.900334 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory" (OuterVolumeSpecName: "inventory") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.900431 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "837cd892-bf64-4ba8-9805-290c7fb37f26" (UID: "837cd892-bf64-4ba8-9805-290c7fb37f26"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.956578 5118 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.956872 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.956979 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.957138 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.957256 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.957355 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/837cd892-bf64-4ba8-9805-290c7fb37f26-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:53 crc kubenswrapper[5118]: I0223 09:19:53.957462 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx7pp\" (UniqueName: 
\"kubernetes.io/projected/837cd892-bf64-4ba8-9805-290c7fb37f26-kube-api-access-rx7pp\") on node \"crc\" DevicePath \"\"" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.265748 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" event={"ID":"837cd892-bf64-4ba8-9805-290c7fb37f26","Type":"ContainerDied","Data":"d5a98f1af22185ec68753dde0418a9077b6c51fe8d74d20dbd493bec60ed9c80"} Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.265798 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a98f1af22185ec68753dde0418a9077b6c51fe8d74d20dbd493bec60ed9c80" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.265835 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-chnnl" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.372960 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-xcx95"] Feb 23 09:19:54 crc kubenswrapper[5118]: E0223 09:19:54.373729 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837cd892-bf64-4ba8-9805-290c7fb37f26" containerName="neutron-metadata-openstack-openstack-cell1" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.373756 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="837cd892-bf64-4ba8-9805-290c7fb37f26" containerName="neutron-metadata-openstack-openstack-cell1" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.375915 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="837cd892-bf64-4ba8-9805-290c7fb37f26" containerName="neutron-metadata-openstack-openstack-cell1" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.377257 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.379432 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.380915 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.381073 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.381609 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.381745 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.387942 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-xcx95"] Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.467878 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.467988 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " 
pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.468033 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.468080 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn7b4\" (UniqueName: \"kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.468158 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.468237 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.569973 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.570076 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.570223 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.570327 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn7b4\" (UniqueName: \"kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.570393 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.570477 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.575861 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.575885 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.576205 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.581947 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.584457 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.585719 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn7b4\" (UniqueName: \"kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4\") pod \"libvirt-openstack-openstack-cell1-xcx95\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") " pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:54 crc kubenswrapper[5118]: I0223 09:19:54.703634 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" Feb 23 09:19:56 crc kubenswrapper[5118]: I0223 09:19:56.183159 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-xcx95"] Feb 23 09:19:56 crc kubenswrapper[5118]: I0223 09:19:56.289252 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" event={"ID":"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c","Type":"ContainerStarted","Data":"1e312b3a485820e35e4f05219b5d424d339a4145a18735a0e088f031c2357053"} Feb 23 09:19:57 crc kubenswrapper[5118]: I0223 09:19:57.306707 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" event={"ID":"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c","Type":"ContainerStarted","Data":"dc0205e680093128c69f4fa7fc7be50af604f23e97c09f6d6ab821537228cf5c"} Feb 23 09:19:57 crc kubenswrapper[5118]: I0223 09:19:57.335950 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" podStartSLOduration=2.906953424 
podStartE2EDuration="3.335933146s" podCreationTimestamp="2026-02-23 09:19:54 +0000 UTC" firstStartedPulling="2026-02-23 09:19:56.176715661 +0000 UTC m=+9259.180500234" lastFinishedPulling="2026-02-23 09:19:56.605695383 +0000 UTC m=+9259.609479956" observedRunningTime="2026-02-23 09:19:57.328137958 +0000 UTC m=+9260.331922551" watchObservedRunningTime="2026-02-23 09:19:57.335933146 +0000 UTC m=+9260.339717719" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.039209 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.044385 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.074318 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.120212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.120354 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.120440 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnspf\" (UniqueName: 
\"kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.223130 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.223732 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnspf\" (UniqueName: \"kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.223847 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.223910 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.224495 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.523438 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnspf\" (UniqueName: \"kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf\") pod \"certified-operators-xtczp\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:23 crc kubenswrapper[5118]: I0223 09:20:23.674817 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:24 crc kubenswrapper[5118]: I0223 09:20:24.202023 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:24 crc kubenswrapper[5118]: W0223 09:20:24.211293 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63f6680_e32a_42c5_ac38_66cb2dc57601.slice/crio-11d1b237a4b55094cb55c346ca37b3ff4b01bdc3984193ba34076d82f12d24fc WatchSource:0}: Error finding container 11d1b237a4b55094cb55c346ca37b3ff4b01bdc3984193ba34076d82f12d24fc: Status 404 returned error can't find the container with id 11d1b237a4b55094cb55c346ca37b3ff4b01bdc3984193ba34076d82f12d24fc Feb 23 09:20:24 crc kubenswrapper[5118]: I0223 09:20:24.643231 5118 generic.go:334] "Generic (PLEG): container finished" podID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerID="94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51" exitCode=0 Feb 23 09:20:24 crc kubenswrapper[5118]: I0223 09:20:24.643291 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" 
event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerDied","Data":"94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51"} Feb 23 09:20:24 crc kubenswrapper[5118]: I0223 09:20:24.643323 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerStarted","Data":"11d1b237a4b55094cb55c346ca37b3ff4b01bdc3984193ba34076d82f12d24fc"} Feb 23 09:20:25 crc kubenswrapper[5118]: I0223 09:20:25.662356 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerStarted","Data":"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d"} Feb 23 09:20:27 crc kubenswrapper[5118]: I0223 09:20:27.689892 5118 generic.go:334] "Generic (PLEG): container finished" podID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerID="aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d" exitCode=0 Feb 23 09:20:27 crc kubenswrapper[5118]: I0223 09:20:27.689992 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerDied","Data":"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d"} Feb 23 09:20:28 crc kubenswrapper[5118]: I0223 09:20:28.719379 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerStarted","Data":"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a"} Feb 23 09:20:28 crc kubenswrapper[5118]: I0223 09:20:28.748447 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xtczp" podStartSLOduration=2.195834614 podStartE2EDuration="5.748424997s" podCreationTimestamp="2026-02-23 09:20:23 
+0000 UTC" firstStartedPulling="2026-02-23 09:20:24.648595077 +0000 UTC m=+9287.652379650" lastFinishedPulling="2026-02-23 09:20:28.20118528 +0000 UTC m=+9291.204970033" observedRunningTime="2026-02-23 09:20:28.741818447 +0000 UTC m=+9291.745603010" watchObservedRunningTime="2026-02-23 09:20:28.748424997 +0000 UTC m=+9291.752209570" Feb 23 09:20:33 crc kubenswrapper[5118]: I0223 09:20:33.676439 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:33 crc kubenswrapper[5118]: I0223 09:20:33.678536 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:34 crc kubenswrapper[5118]: I0223 09:20:34.888725 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xtczp" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="registry-server" probeResult="failure" output=< Feb 23 09:20:34 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:20:34 crc kubenswrapper[5118]: > Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.318290 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.322342 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.332924 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.455406 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.455547 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sf6\" (UniqueName: \"kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.455663 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.558017 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.558312 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-57sf6\" (UniqueName: \"kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.562567 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.558539 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.563236 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.580728 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sf6\" (UniqueName: \"kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6\") pod \"redhat-operators-4x76n\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:41 crc kubenswrapper[5118]: I0223 09:20:41.650656 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:42 crc kubenswrapper[5118]: I0223 09:20:42.141583 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:20:42 crc kubenswrapper[5118]: I0223 09:20:42.868835 5118 generic.go:334] "Generic (PLEG): container finished" podID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerID="7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68" exitCode=0 Feb 23 09:20:42 crc kubenswrapper[5118]: I0223 09:20:42.869403 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerDied","Data":"7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68"} Feb 23 09:20:42 crc kubenswrapper[5118]: I0223 09:20:42.869458 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerStarted","Data":"832b4a581c67a00bcf1f2db8cb358b16a7ed1a9582a3af84b1fd9230284ffb29"} Feb 23 09:20:43 crc kubenswrapper[5118]: I0223 09:20:43.759158 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:43 crc kubenswrapper[5118]: I0223 09:20:43.826994 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:43 crc kubenswrapper[5118]: I0223 09:20:43.884533 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerStarted","Data":"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976"} Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.087054 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.087567 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xtczp" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="registry-server" containerID="cri-o://368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a" gracePeriod=2 Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.561811 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.678780 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content\") pod \"f63f6680-e32a-42c5-ac38-66cb2dc57601\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.678925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnspf\" (UniqueName: \"kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf\") pod \"f63f6680-e32a-42c5-ac38-66cb2dc57601\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.678989 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities\") pod \"f63f6680-e32a-42c5-ac38-66cb2dc57601\" (UID: \"f63f6680-e32a-42c5-ac38-66cb2dc57601\") " Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.679786 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities" (OuterVolumeSpecName: "utilities") pod "f63f6680-e32a-42c5-ac38-66cb2dc57601" (UID: 
"f63f6680-e32a-42c5-ac38-66cb2dc57601"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.686379 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf" (OuterVolumeSpecName: "kube-api-access-bnspf") pod "f63f6680-e32a-42c5-ac38-66cb2dc57601" (UID: "f63f6680-e32a-42c5-ac38-66cb2dc57601"). InnerVolumeSpecName "kube-api-access-bnspf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.730935 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63f6680-e32a-42c5-ac38-66cb2dc57601" (UID: "f63f6680-e32a-42c5-ac38-66cb2dc57601"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.782304 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.782347 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnspf\" (UniqueName: \"kubernetes.io/projected/f63f6680-e32a-42c5-ac38-66cb2dc57601-kube-api-access-bnspf\") on node \"crc\" DevicePath \"\"" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.782359 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63f6680-e32a-42c5-ac38-66cb2dc57601-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.924646 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerID="368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a" exitCode=0 Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.924696 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerDied","Data":"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a"} Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.924725 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xtczp" event={"ID":"f63f6680-e32a-42c5-ac38-66cb2dc57601","Type":"ContainerDied","Data":"11d1b237a4b55094cb55c346ca37b3ff4b01bdc3984193ba34076d82f12d24fc"} Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.924747 5118 scope.go:117] "RemoveContainer" containerID="368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.924760 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xtczp" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.967198 5118 scope.go:117] "RemoveContainer" containerID="aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d" Feb 23 09:20:46 crc kubenswrapper[5118]: I0223 09:20:46.988923 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.004172 5118 scope.go:117] "RemoveContainer" containerID="94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.011183 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xtczp"] Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.056644 5118 scope.go:117] "RemoveContainer" containerID="368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a" Feb 23 09:20:47 crc kubenswrapper[5118]: E0223 09:20:47.057138 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a\": container with ID starting with 368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a not found: ID does not exist" containerID="368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.057187 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a"} err="failed to get container status \"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a\": rpc error: code = NotFound desc = could not find container \"368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a\": container with ID starting with 368e9ce714d2a40c73d2151ea4054d1b5917af037d6add9b5f103d9ab53b9a7a not 
found: ID does not exist" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.057214 5118 scope.go:117] "RemoveContainer" containerID="aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d" Feb 23 09:20:47 crc kubenswrapper[5118]: E0223 09:20:47.057537 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d\": container with ID starting with aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d not found: ID does not exist" containerID="aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.057571 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d"} err="failed to get container status \"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d\": rpc error: code = NotFound desc = could not find container \"aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d\": container with ID starting with aa0dc7c5cfaf6166dde2ca4371267c95c0c2b79ee56fd7b792cf4b9802d67f6d not found: ID does not exist" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.057593 5118 scope.go:117] "RemoveContainer" containerID="94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51" Feb 23 09:20:47 crc kubenswrapper[5118]: E0223 09:20:47.058041 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51\": container with ID starting with 94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51 not found: ID does not exist" containerID="94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.058070 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51"} err="failed to get container status \"94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51\": rpc error: code = NotFound desc = could not find container \"94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51\": container with ID starting with 94267d66a17aee3daa63ad76741c8df8c65a1b24e27361846698de4144574a51 not found: ID does not exist" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.721870 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" path="/var/lib/kubelet/pods/f63f6680-e32a-42c5-ac38-66cb2dc57601/volumes" Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.936647 5118 generic.go:334] "Generic (PLEG): container finished" podID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerID="92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976" exitCode=0 Feb 23 09:20:47 crc kubenswrapper[5118]: I0223 09:20:47.936719 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerDied","Data":"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976"} Feb 23 09:20:48 crc kubenswrapper[5118]: I0223 09:20:48.964906 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerStarted","Data":"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56"} Feb 23 09:20:48 crc kubenswrapper[5118]: I0223 09:20:48.986216 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4x76n" podStartSLOduration=2.50486492 podStartE2EDuration="7.98619759s" podCreationTimestamp="2026-02-23 09:20:41 +0000 UTC" 
firstStartedPulling="2026-02-23 09:20:42.871624359 +0000 UTC m=+9305.875408932" lastFinishedPulling="2026-02-23 09:20:48.352957019 +0000 UTC m=+9311.356741602" observedRunningTime="2026-02-23 09:20:48.984124891 +0000 UTC m=+9311.987909494" watchObservedRunningTime="2026-02-23 09:20:48.98619759 +0000 UTC m=+9311.989982163" Feb 23 09:20:51 crc kubenswrapper[5118]: I0223 09:20:51.651078 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:51 crc kubenswrapper[5118]: I0223 09:20:51.652348 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:20:52 crc kubenswrapper[5118]: I0223 09:20:52.701129 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4x76n" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server" probeResult="failure" output=< Feb 23 09:20:52 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:20:52 crc kubenswrapper[5118]: > Feb 23 09:21:02 crc kubenswrapper[5118]: I0223 09:21:02.704664 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4x76n" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server" probeResult="failure" output=< Feb 23 09:21:02 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:21:02 crc kubenswrapper[5118]: > Feb 23 09:21:12 crc kubenswrapper[5118]: I0223 09:21:12.717390 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4x76n" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server" probeResult="failure" output=< Feb 23 09:21:12 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:21:12 crc kubenswrapper[5118]: > Feb 23 09:21:21 crc kubenswrapper[5118]: I0223 
09:21:21.709263 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:21:21 crc kubenswrapper[5118]: I0223 09:21:21.776062 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:21:21 crc kubenswrapper[5118]: I0223 09:21:21.976302 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:21:23 crc kubenswrapper[5118]: I0223 09:21:23.384493 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4x76n" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server" containerID="cri-o://27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56" gracePeriod=2 Feb 23 09:21:23 crc kubenswrapper[5118]: I0223 09:21:23.832361 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.015255 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities\") pod \"82a3c98a-8a77-423f-bfe3-0391c70f9301\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.015433 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57sf6\" (UniqueName: \"kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6\") pod \"82a3c98a-8a77-423f-bfe3-0391c70f9301\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.015686 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content\") pod \"82a3c98a-8a77-423f-bfe3-0391c70f9301\" (UID: \"82a3c98a-8a77-423f-bfe3-0391c70f9301\") " Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.016042 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities" (OuterVolumeSpecName: "utilities") pod "82a3c98a-8a77-423f-bfe3-0391c70f9301" (UID: "82a3c98a-8a77-423f-bfe3-0391c70f9301"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.016549 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.029171 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6" (OuterVolumeSpecName: "kube-api-access-57sf6") pod "82a3c98a-8a77-423f-bfe3-0391c70f9301" (UID: "82a3c98a-8a77-423f-bfe3-0391c70f9301"). InnerVolumeSpecName "kube-api-access-57sf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.118346 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57sf6\" (UniqueName: \"kubernetes.io/projected/82a3c98a-8a77-423f-bfe3-0391c70f9301-kube-api-access-57sf6\") on node \"crc\" DevicePath \"\"" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.141849 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82a3c98a-8a77-423f-bfe3-0391c70f9301" (UID: "82a3c98a-8a77-423f-bfe3-0391c70f9301"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.220306 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a3c98a-8a77-423f-bfe3-0391c70f9301-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.395331 5118 generic.go:334] "Generic (PLEG): container finished" podID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerID="27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56" exitCode=0 Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.395381 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerDied","Data":"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56"} Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.395446 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4x76n" event={"ID":"82a3c98a-8a77-423f-bfe3-0391c70f9301","Type":"ContainerDied","Data":"832b4a581c67a00bcf1f2db8cb358b16a7ed1a9582a3af84b1fd9230284ffb29"} Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.395475 5118 scope.go:117] "RemoveContainer" containerID="27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.396361 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4x76n" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.416451 5118 scope.go:117] "RemoveContainer" containerID="92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.432596 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.444053 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4x76n"] Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.468836 5118 scope.go:117] "RemoveContainer" containerID="7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.492328 5118 scope.go:117] "RemoveContainer" containerID="27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56" Feb 23 09:21:24 crc kubenswrapper[5118]: E0223 09:21:24.492713 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56\": container with ID starting with 27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56 not found: ID does not exist" containerID="27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.492739 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56"} err="failed to get container status \"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56\": rpc error: code = NotFound desc = could not find container \"27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56\": container with ID starting with 27c2923e01a433b8fadac28414868f48c46fcd853503c2fb493e99524d7a0b56 not found: ID does 
not exist" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.492759 5118 scope.go:117] "RemoveContainer" containerID="92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976" Feb 23 09:21:24 crc kubenswrapper[5118]: E0223 09:21:24.493303 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976\": container with ID starting with 92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976 not found: ID does not exist" containerID="92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.493331 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976"} err="failed to get container status \"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976\": rpc error: code = NotFound desc = could not find container \"92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976\": container with ID starting with 92d808481b16300132e196cd2ae1f8fc1201d2a2e052c4a5bec8977399c6d976 not found: ID does not exist" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.493347 5118 scope.go:117] "RemoveContainer" containerID="7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68" Feb 23 09:21:24 crc kubenswrapper[5118]: E0223 09:21:24.495870 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68\": container with ID starting with 7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68 not found: ID does not exist" containerID="7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68" Feb 23 09:21:24 crc kubenswrapper[5118]: I0223 09:21:24.495914 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68"} err="failed to get container status \"7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68\": rpc error: code = NotFound desc = could not find container \"7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68\": container with ID starting with 7ebae10abb3d2379b47435904d01f95129681285694f3059e225eaabadf90e68 not found: ID does not exist" Feb 23 09:21:25 crc kubenswrapper[5118]: I0223 09:21:25.717000 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" path="/var/lib/kubelet/pods/82a3c98a-8a77-423f-bfe3-0391c70f9301/volumes" Feb 23 09:22:02 crc kubenswrapper[5118]: I0223 09:22:02.975734 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:22:02 crc kubenswrapper[5118]: I0223 09:22:02.976481 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:22:32 crc kubenswrapper[5118]: I0223 09:22:32.975164 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:22:32 crc kubenswrapper[5118]: I0223 09:22:32.975846 5118 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:23:02 crc kubenswrapper[5118]: I0223 09:23:02.975619 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:23:02 crc kubenswrapper[5118]: I0223 09:23:02.976391 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:23:02 crc kubenswrapper[5118]: I0223 09:23:02.976437 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:23:02 crc kubenswrapper[5118]: I0223 09:23:02.977036 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:23:02 crc kubenswrapper[5118]: I0223 09:23:02.977122 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" 
containerID="cri-o://f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635" gracePeriod=600 Feb 23 09:23:03 crc kubenswrapper[5118]: I0223 09:23:03.784025 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635" exitCode=0 Feb 23 09:23:03 crc kubenswrapper[5118]: I0223 09:23:03.784111 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635"} Feb 23 09:23:03 crc kubenswrapper[5118]: I0223 09:23:03.784439 5118 scope.go:117] "RemoveContainer" containerID="34838d508acffd605592cd77767000701458e57c421fc0d64a6eaec80bc7f8ae" Feb 23 09:23:04 crc kubenswrapper[5118]: I0223 09:23:04.795363 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"} Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.212969 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-52v2q"] Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215064 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="extract-utilities" Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215235 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="extract-utilities" Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215289 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="registry-server" Feb 23 
09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215306 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="registry-server"
Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215338 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215351 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server"
Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215373 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="extract-content"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215388 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="extract-content"
Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215433 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="extract-content"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215447 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="extract-content"
Feb 23 09:24:02 crc kubenswrapper[5118]: E0223 09:24:02.215468 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="extract-utilities"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.215481 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="extract-utilities"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.216004 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a3c98a-8a77-423f-bfe3-0391c70f9301" containerName="registry-server"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.216062 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63f6680-e32a-42c5-ac38-66cb2dc57601" containerName="registry-server"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.219469 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.248282 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52v2q"]
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.298194 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctlg\" (UniqueName: \"kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.298291 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.298383 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.400726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.400885 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctlg\" (UniqueName: \"kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.400917 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.401805 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.405985 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.421776 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctlg\" (UniqueName: \"kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg\") pod \"community-operators-52v2q\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") " pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:02 crc kubenswrapper[5118]: I0223 09:24:02.546170 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:03 crc kubenswrapper[5118]: I0223 09:24:03.119498 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-52v2q"]
Feb 23 09:24:03 crc kubenswrapper[5118]: I0223 09:24:03.430086 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerStarted","Data":"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"}
Feb 23 09:24:03 crc kubenswrapper[5118]: I0223 09:24:03.430527 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerStarted","Data":"54563d096d0d201dea8bbe97a6d0115186a6be248fff4befccbdcc199a7c83f8"}
Feb 23 09:24:03 crc kubenswrapper[5118]: I0223 09:24:03.435388 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 09:24:04 crc kubenswrapper[5118]: I0223 09:24:04.443636 5118 generic.go:334] "Generic (PLEG): container finished" podID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerID="88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3" exitCode=0
Feb 23 09:24:04 crc kubenswrapper[5118]: I0223 09:24:04.443758 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerDied","Data":"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"}
Feb 23 09:24:04 crc kubenswrapper[5118]: I0223 09:24:04.444319 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerStarted","Data":"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"}
Feb 23 09:24:06 crc kubenswrapper[5118]: I0223 09:24:06.467542 5118 generic.go:334] "Generic (PLEG): container finished" podID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerID="74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d" exitCode=0
Feb 23 09:24:06 crc kubenswrapper[5118]: I0223 09:24:06.467650 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerDied","Data":"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"}
Feb 23 09:24:08 crc kubenswrapper[5118]: I0223 09:24:08.512558 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerStarted","Data":"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"}
Feb 23 09:24:08 crc kubenswrapper[5118]: I0223 09:24:08.549492 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-52v2q" podStartSLOduration=3.124187356 podStartE2EDuration="6.549471638s" podCreationTimestamp="2026-02-23 09:24:02 +0000 UTC" firstStartedPulling="2026-02-23 09:24:03.435128214 +0000 UTC m=+9506.438912787" lastFinishedPulling="2026-02-23 09:24:06.860412476 +0000 UTC m=+9509.864197069" observedRunningTime="2026-02-23 09:24:08.538909273 +0000 UTC m=+9511.542693846" watchObservedRunningTime="2026-02-23 09:24:08.549471638 +0000 UTC m=+9511.553256211"
Feb 23 09:24:12 crc kubenswrapper[5118]: I0223 09:24:12.547136 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:12 crc kubenswrapper[5118]: I0223 09:24:12.547716 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:12 crc kubenswrapper[5118]: I0223 09:24:12.602183 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:12 crc kubenswrapper[5118]: I0223 09:24:12.671079 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:12 crc kubenswrapper[5118]: I0223 09:24:12.844400 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52v2q"]
Feb 23 09:24:14 crc kubenswrapper[5118]: I0223 09:24:14.592551 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-52v2q" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="registry-server" containerID="cri-o://eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84" gracePeriod=2
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.067338 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.187792 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctlg\" (UniqueName: \"kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg\") pod \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") "
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.188005 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities\") pod \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") "
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.188028 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content\") pod \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\" (UID: \"ffca3485-cdc3-4082-86e8-bd197a4b1b9e\") "
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.190825 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities" (OuterVolumeSpecName: "utilities") pod "ffca3485-cdc3-4082-86e8-bd197a4b1b9e" (UID: "ffca3485-cdc3-4082-86e8-bd197a4b1b9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.200040 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg" (OuterVolumeSpecName: "kube-api-access-jctlg") pod "ffca3485-cdc3-4082-86e8-bd197a4b1b9e" (UID: "ffca3485-cdc3-4082-86e8-bd197a4b1b9e"). InnerVolumeSpecName "kube-api-access-jctlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.265343 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffca3485-cdc3-4082-86e8-bd197a4b1b9e" (UID: "ffca3485-cdc3-4082-86e8-bd197a4b1b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.290750 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctlg\" (UniqueName: \"kubernetes.io/projected/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-kube-api-access-jctlg\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.290791 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.290809 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffca3485-cdc3-4082-86e8-bd197a4b1b9e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.605066 5118 generic.go:334] "Generic (PLEG): container finished" podID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerID="eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84" exitCode=0
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.605135 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerDied","Data":"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"}
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.605187 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-52v2q"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.605213 5118 scope.go:117] "RemoveContainer" containerID="eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.605189 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-52v2q" event={"ID":"ffca3485-cdc3-4082-86e8-bd197a4b1b9e","Type":"ContainerDied","Data":"54563d096d0d201dea8bbe97a6d0115186a6be248fff4befccbdcc199a7c83f8"}
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.631331 5118 scope.go:117] "RemoveContainer" containerID="74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.665162 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-52v2q"]
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.671325 5118 scope.go:117] "RemoveContainer" containerID="88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.675047 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-52v2q"]
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.708699 5118 scope.go:117] "RemoveContainer" containerID="eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"
Feb 23 09:24:15 crc kubenswrapper[5118]: E0223 09:24:15.709001 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84\": container with ID starting with eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84 not found: ID does not exist" containerID="eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.709049 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84"} err="failed to get container status \"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84\": rpc error: code = NotFound desc = could not find container \"eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84\": container with ID starting with eb04514d1488722fb83585b9a732e0c155e61ec4456a067f97f8d1a34bec7c84 not found: ID does not exist"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.709068 5118 scope.go:117] "RemoveContainer" containerID="74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"
Feb 23 09:24:15 crc kubenswrapper[5118]: E0223 09:24:15.709362 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d\": container with ID starting with 74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d not found: ID does not exist" containerID="74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.709383 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d"} err="failed to get container status \"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d\": rpc error: code = NotFound desc = could not find container \"74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d\": container with ID starting with 74715f062a68f8374575b64e863b50b733167058a61812c503be6315df5a6a0d not found: ID does not exist"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.709395 5118 scope.go:117] "RemoveContainer" containerID="88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"
Feb 23 09:24:15 crc kubenswrapper[5118]: E0223 09:24:15.709560 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3\": container with ID starting with 88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3 not found: ID does not exist" containerID="88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.709578 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3"} err="failed to get container status \"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3\": rpc error: code = NotFound desc = could not find container \"88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3\": container with ID starting with 88b3f3bc982bba6ae0ef338b8015dfff58242b21ba0cfabc27c8acbaae4372b3 not found: ID does not exist"
Feb 23 09:24:15 crc kubenswrapper[5118]: I0223 09:24:15.711332 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" path="/var/lib/kubelet/pods/ffca3485-cdc3-4082-86e8-bd197a4b1b9e/volumes"
Feb 23 09:24:46 crc kubenswrapper[5118]: I0223 09:24:46.940642 5118 generic.go:334] "Generic (PLEG): container finished" podID="7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" containerID="dc0205e680093128c69f4fa7fc7be50af604f23e97c09f6d6ab821537228cf5c" exitCode=0
Feb 23 09:24:46 crc kubenswrapper[5118]: I0223 09:24:46.940784 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" event={"ID":"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c","Type":"ContainerDied","Data":"dc0205e680093128c69f4fa7fc7be50af604f23e97c09f6d6ab821537228cf5c"}
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.450617 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-xcx95"
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569562 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569633 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569717 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569852 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569873 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn7b4\" (UniqueName: \"kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.569894 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle\") pod \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\" (UID: \"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c\") "
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.575531 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4" (OuterVolumeSpecName: "kube-api-access-tn7b4") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "kube-api-access-tn7b4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.575592 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.579655 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph" (OuterVolumeSpecName: "ceph") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.599306 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.604133 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory" (OuterVolumeSpecName: "inventory") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.608905 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" (UID: "7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672298 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672335 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn7b4\" (UniqueName: \"kubernetes.io/projected/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-kube-api-access-tn7b4\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672351 5118 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672362 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672375 5118 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.672386 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.969233 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-xcx95" event={"ID":"7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c","Type":"ContainerDied","Data":"1e312b3a485820e35e4f05219b5d424d339a4145a18735a0e088f031c2357053"}
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.969681 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e312b3a485820e35e4f05219b5d424d339a4145a18735a0e088f031c2357053"
Feb 23 09:24:48 crc kubenswrapper[5118]: I0223 09:24:48.969312 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-xcx95"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.183635 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bvnm9"]
Feb 23 09:24:49 crc kubenswrapper[5118]: E0223 09:24:49.184168 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="extract-utilities"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184187 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="extract-utilities"
Feb 23 09:24:49 crc kubenswrapper[5118]: E0223 09:24:49.184203 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="registry-server"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184210 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="registry-server"
Feb 23 09:24:49 crc kubenswrapper[5118]: E0223 09:24:49.184231 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" containerName="libvirt-openstack-openstack-cell1"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184238 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" containerName="libvirt-openstack-openstack-cell1"
Feb 23 09:24:49 crc kubenswrapper[5118]: E0223 09:24:49.184251 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="extract-content"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184257 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="extract-content"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184443 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffca3485-cdc3-4082-86e8-bd197a4b1b9e" containerName="registry-server"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.184461 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c" containerName="libvirt-openstack-openstack-cell1"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.185210 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.188076 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.188461 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.189174 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.189412 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.189500 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.189555 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.191451 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.220214 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bvnm9"]
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283161 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283269 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283313 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865h5\" (UniqueName: \"kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283331 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283429 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283611 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283651 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.283906 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.284049 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.284232 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.284366 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.284482 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.284608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386299 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386357 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386383 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386449 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386501 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865h5\" (UniqueName: \"kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386525 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386560 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386613 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386638 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386677 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386695 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386750 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.386776 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.387677 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.388605 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.390306 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.390608 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.390620 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.390663 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.392163 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.395796 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.395839 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.396679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.398931 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.401539 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.404968 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865h5\" (UniqueName: \"kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5\") pod \"nova-cell1-openstack-openstack-cell1-bvnm9\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") " pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:49 crc kubenswrapper[5118]: I0223 09:24:49.509517 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" Feb 23 09:24:50 crc kubenswrapper[5118]: I0223 09:24:50.045515 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-bvnm9"] Feb 23 09:24:50 crc kubenswrapper[5118]: I0223 09:24:50.993452 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" event={"ID":"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea","Type":"ContainerStarted","Data":"7305f98177c11bbb57db19ee2891d89956c45c5b0b1c8cff3030693b6c670f1a"} Feb 23 09:24:52 crc kubenswrapper[5118]: I0223 09:24:52.005839 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" event={"ID":"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea","Type":"ContainerStarted","Data":"7d552bf35964f6406f3ed85fd1f3792b6b9af142a986f3924e1a6f37cd960554"} Feb 23 09:24:52 crc kubenswrapper[5118]: I0223 09:24:52.038040 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" podStartSLOduration=2.567979762 podStartE2EDuration="3.038018016s" podCreationTimestamp="2026-02-23 09:24:49 +0000 UTC" firstStartedPulling="2026-02-23 09:24:50.323391857 +0000 UTC m=+9553.327176430" lastFinishedPulling="2026-02-23 09:24:50.793430071 +0000 UTC m=+9553.797214684" observedRunningTime="2026-02-23 09:24:52.028925797 +0000 UTC m=+9555.032710390" watchObservedRunningTime="2026-02-23 09:24:52.038018016 +0000 UTC m=+9555.041802599" Feb 23 09:25:32 crc kubenswrapper[5118]: I0223 09:25:32.975177 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:25:32 crc kubenswrapper[5118]: I0223 09:25:32.975830 5118 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.261598 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"] Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.265250 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.294540 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"] Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.345849 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.346212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.346254 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtdq\" (UniqueName: \"kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq\") pod \"redhat-marketplace-wpvwz\" (UID: 
\"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.448167 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.448211 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtdq\" (UniqueName: \"kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.448332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.448812 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.448987 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " 
pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.470679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtdq\" (UniqueName: \"kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq\") pod \"redhat-marketplace-wpvwz\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") " pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:57 crc kubenswrapper[5118]: I0223 09:25:57.586131 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:25:58 crc kubenswrapper[5118]: I0223 09:25:58.134234 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"] Feb 23 09:25:58 crc kubenswrapper[5118]: I0223 09:25:58.737915 5118 generic.go:334] "Generic (PLEG): container finished" podID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerID="dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca" exitCode=0 Feb 23 09:25:58 crc kubenswrapper[5118]: I0223 09:25:58.738356 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerDied","Data":"dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca"} Feb 23 09:25:58 crc kubenswrapper[5118]: I0223 09:25:58.738403 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerStarted","Data":"cc5578521266a278c457c047c02faf260630db88bc4888e82804b85019301ff4"} Feb 23 09:26:00 crc kubenswrapper[5118]: I0223 09:26:00.762932 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" 
event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerStarted","Data":"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"} Feb 23 09:26:01 crc kubenswrapper[5118]: I0223 09:26:01.779846 5118 generic.go:334] "Generic (PLEG): container finished" podID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerID="b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e" exitCode=0 Feb 23 09:26:01 crc kubenswrapper[5118]: I0223 09:26:01.779987 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerDied","Data":"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"} Feb 23 09:26:02 crc kubenswrapper[5118]: I0223 09:26:02.975635 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:26:02 crc kubenswrapper[5118]: I0223 09:26:02.975984 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:26:04 crc kubenswrapper[5118]: I0223 09:26:04.818318 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerStarted","Data":"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26"} Feb 23 09:26:04 crc kubenswrapper[5118]: I0223 09:26:04.847284 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpvwz" 
podStartSLOduration=2.962696352 podStartE2EDuration="7.84726537s" podCreationTimestamp="2026-02-23 09:25:57 +0000 UTC" firstStartedPulling="2026-02-23 09:25:58.746419291 +0000 UTC m=+9621.750203864" lastFinishedPulling="2026-02-23 09:26:03.630988309 +0000 UTC m=+9626.634772882" observedRunningTime="2026-02-23 09:26:04.837546657 +0000 UTC m=+9627.841331240" watchObservedRunningTime="2026-02-23 09:26:04.84726537 +0000 UTC m=+9627.851049943" Feb 23 09:26:07 crc kubenswrapper[5118]: I0223 09:26:07.586778 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:26:07 crc kubenswrapper[5118]: I0223 09:26:07.587440 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:26:07 crc kubenswrapper[5118]: I0223 09:26:07.638564 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:26:17 crc kubenswrapper[5118]: I0223 09:26:17.636215 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:26:17 crc kubenswrapper[5118]: I0223 09:26:17.686228 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"] Feb 23 09:26:17 crc kubenswrapper[5118]: I0223 09:26:17.956898 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpvwz" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="registry-server" containerID="cri-o://807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26" gracePeriod=2 Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.940985 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpvwz" Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.974869 5118 generic.go:334] "Generic (PLEG): container finished" podID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerID="807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26" exitCode=0 Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.974920 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerDied","Data":"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26"} Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.974949 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpvwz" event={"ID":"e792ef30-0ceb-4b27-887e-b45bfd591b43","Type":"ContainerDied","Data":"cc5578521266a278c457c047c02faf260630db88bc4888e82804b85019301ff4"} Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.974967 5118 scope.go:117] "RemoveContainer" containerID="807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26" Feb 23 09:26:18 crc kubenswrapper[5118]: I0223 09:26:18.974995 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpvwz"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.006718 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frtdq\" (UniqueName: \"kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq\") pod \"e792ef30-0ceb-4b27-887e-b45bfd591b43\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") "
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.006888 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content\") pod \"e792ef30-0ceb-4b27-887e-b45bfd591b43\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") "
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.006939 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities\") pod \"e792ef30-0ceb-4b27-887e-b45bfd591b43\" (UID: \"e792ef30-0ceb-4b27-887e-b45bfd591b43\") "
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.009602 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities" (OuterVolumeSpecName: "utilities") pod "e792ef30-0ceb-4b27-887e-b45bfd591b43" (UID: "e792ef30-0ceb-4b27-887e-b45bfd591b43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.026994 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq" (OuterVolumeSpecName: "kube-api-access-frtdq") pod "e792ef30-0ceb-4b27-887e-b45bfd591b43" (UID: "e792ef30-0ceb-4b27-887e-b45bfd591b43"). InnerVolumeSpecName "kube-api-access-frtdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.027777 5118 scope.go:117] "RemoveContainer" containerID="b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.040818 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e792ef30-0ceb-4b27-887e-b45bfd591b43" (UID: "e792ef30-0ceb-4b27-887e-b45bfd591b43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.074230 5118 scope.go:117] "RemoveContainer" containerID="dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.110950 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frtdq\" (UniqueName: \"kubernetes.io/projected/e792ef30-0ceb-4b27-887e-b45bfd591b43-kube-api-access-frtdq\") on node \"crc\" DevicePath \"\""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.111004 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.111023 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e792ef30-0ceb-4b27-887e-b45bfd591b43-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.115401 5118 scope.go:117] "RemoveContainer" containerID="807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26"
Feb 23 09:26:19 crc kubenswrapper[5118]: E0223 09:26:19.116319 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26\": container with ID starting with 807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26 not found: ID does not exist" containerID="807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.116365 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26"} err="failed to get container status \"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26\": rpc error: code = NotFound desc = could not find container \"807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26\": container with ID starting with 807503f05a7019ee091fb92970a01baeac0e9e5e54c2f791f42340160f685b26 not found: ID does not exist"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.116395 5118 scope.go:117] "RemoveContainer" containerID="b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"
Feb 23 09:26:19 crc kubenswrapper[5118]: E0223 09:26:19.117125 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e\": container with ID starting with b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e not found: ID does not exist" containerID="b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.117187 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e"} err="failed to get container status \"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e\": rpc error: code = NotFound desc = could not find container \"b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e\": container with ID starting with b34e3340b299634159a261021eeba4b78ab4b857d431812d33f7818de612fb3e not found: ID does not exist"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.117233 5118 scope.go:117] "RemoveContainer" containerID="dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca"
Feb 23 09:26:19 crc kubenswrapper[5118]: E0223 09:26:19.117843 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca\": container with ID starting with dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca not found: ID does not exist" containerID="dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.117899 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca"} err="failed to get container status \"dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca\": rpc error: code = NotFound desc = could not find container \"dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca\": container with ID starting with dd0d21e3db9c3c591fdc1f40cc9bb1ae6f80eca3aab3f1f5e877f7efd89609ca not found: ID does not exist"
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.322730 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"]
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.333668 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpvwz"]
Feb 23 09:26:19 crc kubenswrapper[5118]: I0223 09:26:19.711325 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" path="/var/lib/kubelet/pods/e792ef30-0ceb-4b27-887e-b45bfd591b43/volumes"
Feb 23 09:26:32 crc kubenswrapper[5118]: I0223 09:26:32.975869 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:26:32 crc kubenswrapper[5118]: I0223 09:26:32.976769 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:26:32 crc kubenswrapper[5118]: I0223 09:26:32.976834 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 09:26:32 crc kubenswrapper[5118]: I0223 09:26:32.977875 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:26:32 crc kubenswrapper[5118]: I0223 09:26:32.977969 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" gracePeriod=600
Feb 23 09:26:33 crc kubenswrapper[5118]: E0223 09:26:33.110189 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:26:33 crc kubenswrapper[5118]: I0223 09:26:33.119883 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" exitCode=0
Feb 23 09:26:33 crc kubenswrapper[5118]: I0223 09:26:33.119934 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"}
Feb 23 09:26:33 crc kubenswrapper[5118]: I0223 09:26:33.119972 5118 scope.go:117] "RemoveContainer" containerID="f0c958d89c259c74afc6e743f24c284b530cfbbe89a48ecbdcf33294b5a6d635"
Feb 23 09:26:33 crc kubenswrapper[5118]: I0223 09:26:33.120954 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:26:33 crc kubenswrapper[5118]: E0223 09:26:33.123725 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:26:44 crc kubenswrapper[5118]: I0223 09:26:44.697682 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:26:44 crc kubenswrapper[5118]: E0223 09:26:44.698338 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:26:59 crc kubenswrapper[5118]: I0223 09:26:59.697620 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:26:59 crc kubenswrapper[5118]: E0223 09:26:59.698343 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:27:13 crc kubenswrapper[5118]: I0223 09:27:13.698173 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:27:13 crc kubenswrapper[5118]: E0223 09:27:13.699829 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:27:26 crc kubenswrapper[5118]: I0223 09:27:26.698538 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:27:26 crc kubenswrapper[5118]: E0223 09:27:26.700294 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:27:39 crc kubenswrapper[5118]: I0223 09:27:39.702074 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:27:39 crc kubenswrapper[5118]: E0223 09:27:39.703179 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:27:53 crc kubenswrapper[5118]: I0223 09:27:53.701236 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9"
Feb 23 09:27:53 crc kubenswrapper[5118]: E0223 09:27:53.703453 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:27:54 crc kubenswrapper[5118]: I0223 09:27:54.046678 5118 generic.go:334] "Generic (PLEG): container finished" podID="c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" containerID="7d552bf35964f6406f3ed85fd1f3792b6b9af142a986f3924e1a6f37cd960554" exitCode=0
Feb 23 09:27:54 crc kubenswrapper[5118]: I0223 09:27:54.046746 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" event={"ID":"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea","Type":"ContainerDied","Data":"7d552bf35964f6406f3ed85fd1f3792b6b9af142a986f3924e1a6f37cd960554"}
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.566939 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.653183 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.654963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.654996 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655061 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655149 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655179 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655203 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-865h5\" (UniqueName: \"kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655434 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655464 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655485 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655547 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.655577 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1\") pod \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\" (UID: \"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea\") "
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.661447 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph" (OuterVolumeSpecName: "ceph") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.675871 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5" (OuterVolumeSpecName: "kube-api-access-865h5") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "kube-api-access-865h5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.679808 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.698614 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory" (OuterVolumeSpecName: "inventory") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.699959 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.714445 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.717341 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.726025 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.729739 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.737677 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.741377 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.743869 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.754367 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" (UID: "c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758738 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758769 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758781 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758792 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758804 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758819 5118 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758831 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758845 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-865h5\" (UniqueName: \"kubernetes.io/projected/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-kube-api-access-865h5\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758860 5118 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758872 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758882 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758893 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:55 crc kubenswrapper[5118]: I0223 09:27:55.758903 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.069960 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9" event={"ID":"c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea","Type":"ContainerDied","Data":"7305f98177c11bbb57db19ee2891d89956c45c5b0b1c8cff3030693b6c670f1a"}
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.070231 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7305f98177c11bbb57db19ee2891d89956c45c5b0b1c8cff3030693b6c670f1a"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.070030 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-bvnm9"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.169641 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-56ljn"]
Feb 23 09:27:56 crc kubenswrapper[5118]: E0223 09:27:56.170065 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="extract-utilities"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170084 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="extract-utilities"
Feb 23 09:27:56 crc kubenswrapper[5118]: E0223 09:27:56.170133 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="registry-server"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170140 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="registry-server"
Feb 23 09:27:56 crc kubenswrapper[5118]: E0223 09:27:56.170149 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="extract-content"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170156 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="extract-content"
Feb 23 09:27:56 crc kubenswrapper[5118]: E0223 09:27:56.170173 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" containerName="nova-cell1-openstack-openstack-cell1"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170178 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" containerName="nova-cell1-openstack-openstack-cell1"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170393 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e792ef30-0ceb-4b27-887e-b45bfd591b43" containerName="registry-server"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.170414 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea" containerName="nova-cell1-openstack-openstack-cell1"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.171537 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.173873 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.174456 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.174852 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.175405 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.176247 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.182541 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-56ljn"]
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268123 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268179 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268218 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9qc\" (UniqueName: \"kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268247 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268459 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268623 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268767 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.268994 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.370673 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.370771 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.370853 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.371021 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.371819 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn"
Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.371868 5118 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.371907 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9qc\" (UniqueName: \"kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.371963 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.374295 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.376831 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc 
kubenswrapper[5118]: I0223 09:27:56.379604 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.380646 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.381657 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.382297 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.391835 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-56ljn\" 
(UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.411908 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9qc\" (UniqueName: \"kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc\") pod \"telemetry-openstack-openstack-cell1-56ljn\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:56 crc kubenswrapper[5118]: I0223 09:27:56.503583 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:27:57 crc kubenswrapper[5118]: I0223 09:27:57.164335 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-56ljn"] Feb 23 09:27:58 crc kubenswrapper[5118]: I0223 09:27:58.091914 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" event={"ID":"a52048ce-f33c-40af-ac4b-8d6981ff6f60","Type":"ContainerStarted","Data":"7920f34e5ea254f0204db4c39260ec689fb273994e401351c31777630f0f5eee"} Feb 23 09:27:58 crc kubenswrapper[5118]: I0223 09:27:58.092227 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" event={"ID":"a52048ce-f33c-40af-ac4b-8d6981ff6f60","Type":"ContainerStarted","Data":"e246d0c6aa6310ed4d975057d43ec5fa962b8a1817572b9cee51f40c94e37df0"} Feb 23 09:27:58 crc kubenswrapper[5118]: I0223 09:27:58.117303 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" podStartSLOduration=1.664243356 podStartE2EDuration="2.117281859s" podCreationTimestamp="2026-02-23 09:27:56 +0000 UTC" firstStartedPulling="2026-02-23 09:27:57.16180198 +0000 UTC m=+9740.165586553" lastFinishedPulling="2026-02-23 
09:27:57.614840483 +0000 UTC m=+9740.618625056" observedRunningTime="2026-02-23 09:27:58.111989991 +0000 UTC m=+9741.115774574" watchObservedRunningTime="2026-02-23 09:27:58.117281859 +0000 UTC m=+9741.121066432" Feb 23 09:28:08 crc kubenswrapper[5118]: I0223 09:28:08.698109 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:28:08 crc kubenswrapper[5118]: E0223 09:28:08.698981 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:28:22 crc kubenswrapper[5118]: I0223 09:28:22.698078 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:28:22 crc kubenswrapper[5118]: E0223 09:28:22.699221 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:28:33 crc kubenswrapper[5118]: I0223 09:28:33.697820 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:28:33 crc kubenswrapper[5118]: E0223 09:28:33.698507 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:28:48 crc kubenswrapper[5118]: I0223 09:28:48.697411 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:28:48 crc kubenswrapper[5118]: E0223 09:28:48.698290 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:29:02 crc kubenswrapper[5118]: I0223 09:29:02.697901 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:29:02 crc kubenswrapper[5118]: E0223 09:29:02.698556 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:29:17 crc kubenswrapper[5118]: I0223 09:29:17.703250 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:29:17 crc kubenswrapper[5118]: E0223 09:29:17.703992 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:29:32 crc kubenswrapper[5118]: I0223 09:29:32.697631 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:29:32 crc kubenswrapper[5118]: E0223 09:29:32.698360 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:29:45 crc kubenswrapper[5118]: I0223 09:29:45.701421 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:29:45 crc kubenswrapper[5118]: E0223 09:29:45.702995 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:29:59 crc kubenswrapper[5118]: I0223 09:29:59.698177 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:29:59 crc kubenswrapper[5118]: E0223 09:29:59.700064 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.154004 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6"] Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.155888 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.157978 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.158182 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.182592 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6"] Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.270862 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.270923 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume\") 
pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.271166 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2spwf\" (UniqueName: \"kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.373303 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.373372 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.373436 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2spwf\" (UniqueName: \"kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.374226 5118 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.379609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.390121 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2spwf\" (UniqueName: \"kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf\") pod \"collect-profiles-29530650-qchp6\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.490938 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:00 crc kubenswrapper[5118]: I0223 09:30:00.922674 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6"] Feb 23 09:30:01 crc kubenswrapper[5118]: I0223 09:30:01.367649 5118 generic.go:334] "Generic (PLEG): container finished" podID="532cd860-7918-44d0-9f42-0fad388dc16c" containerID="f2c89673d5a4b994e6bb0e8931888d55a0fd2871f87313418b941c4d6304b711" exitCode=0 Feb 23 09:30:01 crc kubenswrapper[5118]: I0223 09:30:01.367735 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" event={"ID":"532cd860-7918-44d0-9f42-0fad388dc16c","Type":"ContainerDied","Data":"f2c89673d5a4b994e6bb0e8931888d55a0fd2871f87313418b941c4d6304b711"} Feb 23 09:30:01 crc kubenswrapper[5118]: I0223 09:30:01.367770 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" event={"ID":"532cd860-7918-44d0-9f42-0fad388dc16c","Type":"ContainerStarted","Data":"8bf9043d7b3d5e6b44091ac1e1df263eed0bd12118db6dfd0841dc9f50e68846"} Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.773551 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.832301 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2spwf\" (UniqueName: \"kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf\") pod \"532cd860-7918-44d0-9f42-0fad388dc16c\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.832387 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume\") pod \"532cd860-7918-44d0-9f42-0fad388dc16c\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.832502 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume\") pod \"532cd860-7918-44d0-9f42-0fad388dc16c\" (UID: \"532cd860-7918-44d0-9f42-0fad388dc16c\") " Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.833977 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume" (OuterVolumeSpecName: "config-volume") pod "532cd860-7918-44d0-9f42-0fad388dc16c" (UID: "532cd860-7918-44d0-9f42-0fad388dc16c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.838250 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf" (OuterVolumeSpecName: "kube-api-access-2spwf") pod "532cd860-7918-44d0-9f42-0fad388dc16c" (UID: "532cd860-7918-44d0-9f42-0fad388dc16c"). 
InnerVolumeSpecName "kube-api-access-2spwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.838440 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "532cd860-7918-44d0-9f42-0fad388dc16c" (UID: "532cd860-7918-44d0-9f42-0fad388dc16c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.934850 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/532cd860-7918-44d0-9f42-0fad388dc16c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.934889 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2spwf\" (UniqueName: \"kubernetes.io/projected/532cd860-7918-44d0-9f42-0fad388dc16c-kube-api-access-2spwf\") on node \"crc\" DevicePath \"\"" Feb 23 09:30:02 crc kubenswrapper[5118]: I0223 09:30:02.934899 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/532cd860-7918-44d0-9f42-0fad388dc16c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:30:03 crc kubenswrapper[5118]: I0223 09:30:03.400950 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" event={"ID":"532cd860-7918-44d0-9f42-0fad388dc16c","Type":"ContainerDied","Data":"8bf9043d7b3d5e6b44091ac1e1df263eed0bd12118db6dfd0841dc9f50e68846"} Feb 23 09:30:03 crc kubenswrapper[5118]: I0223 09:30:03.401592 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf9043d7b3d5e6b44091ac1e1df263eed0bd12118db6dfd0841dc9f50e68846" Feb 23 09:30:03 crc kubenswrapper[5118]: I0223 09:30:03.400989 5118 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6" Feb 23 09:30:03 crc kubenswrapper[5118]: I0223 09:30:03.866780 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"] Feb 23 09:30:03 crc kubenswrapper[5118]: I0223 09:30:03.875282 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530605-lfmnc"] Feb 23 09:30:05 crc kubenswrapper[5118]: I0223 09:30:05.714154 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3523972d-1d1d-48dc-9cef-537318ac933f" path="/var/lib/kubelet/pods/3523972d-1d1d-48dc-9cef-537318ac933f/volumes" Feb 23 09:30:07 crc kubenswrapper[5118]: I0223 09:30:07.508065 5118 scope.go:117] "RemoveContainer" containerID="60457dc2873dba059482d074740ef754f88622c1391d18ffc2c9bbad22daeed1" Feb 23 09:30:10 crc kubenswrapper[5118]: I0223 09:30:10.697830 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:30:10 crc kubenswrapper[5118]: E0223 09:30:10.698655 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:30:25 crc kubenswrapper[5118]: I0223 09:30:25.698373 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:30:25 crc kubenswrapper[5118]: E0223 09:30:25.699334 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:30:40 crc kubenswrapper[5118]: I0223 09:30:40.697411 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:30:40 crc kubenswrapper[5118]: E0223 09:30:40.698607 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:30:54 crc kubenswrapper[5118]: I0223 09:30:54.697233 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:30:54 crc kubenswrapper[5118]: E0223 09:30:54.699913 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:31:08 crc kubenswrapper[5118]: I0223 09:31:08.697339 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:31:08 crc kubenswrapper[5118]: E0223 09:31:08.698084 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:31:20 crc kubenswrapper[5118]: I0223 09:31:20.697321 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:31:20 crc kubenswrapper[5118]: E0223 09:31:20.698487 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:31:32 crc kubenswrapper[5118]: I0223 09:31:32.697766 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:31:32 crc kubenswrapper[5118]: E0223 09:31:32.698960 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.796716 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:31:44 crc kubenswrapper[5118]: E0223 09:31:44.797630 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="532cd860-7918-44d0-9f42-0fad388dc16c" containerName="collect-profiles" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.797641 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="532cd860-7918-44d0-9f42-0fad388dc16c" containerName="collect-profiles" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.797835 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="532cd860-7918-44d0-9f42-0fad388dc16c" containerName="collect-profiles" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.799161 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.824387 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.840339 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.840412 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.840440 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjvl\" (UniqueName: \"kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl\") pod \"redhat-operators-tc62l\" (UID: 
\"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.942815 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.942888 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.942917 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stjvl\" (UniqueName: \"kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.943370 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:44 crc kubenswrapper[5118]: I0223 09:31:44.943452 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " 
pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:45 crc kubenswrapper[5118]: I0223 09:31:45.325053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjvl\" (UniqueName: \"kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl\") pod \"redhat-operators-tc62l\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:45 crc kubenswrapper[5118]: I0223 09:31:45.424776 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:45 crc kubenswrapper[5118]: I0223 09:31:45.933458 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:31:46 crc kubenswrapper[5118]: I0223 09:31:46.610288 5118 generic.go:334] "Generic (PLEG): container finished" podID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerID="af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439" exitCode=0 Feb 23 09:31:46 crc kubenswrapper[5118]: I0223 09:31:46.610377 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerDied","Data":"af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439"} Feb 23 09:31:46 crc kubenswrapper[5118]: I0223 09:31:46.610623 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerStarted","Data":"7a2e0240ccf146736ffa3f880ac3c6e2541d73abe6e4367122c24ab301366a02"} Feb 23 09:31:46 crc kubenswrapper[5118]: I0223 09:31:46.612789 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:31:47 crc kubenswrapper[5118]: I0223 09:31:47.626306 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerStarted","Data":"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f"} Feb 23 09:31:47 crc kubenswrapper[5118]: I0223 09:31:47.704953 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:31:48 crc kubenswrapper[5118]: I0223 09:31:48.641782 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08"} Feb 23 09:31:52 crc kubenswrapper[5118]: I0223 09:31:52.695720 5118 generic.go:334] "Generic (PLEG): container finished" podID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerID="35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f" exitCode=0 Feb 23 09:31:52 crc kubenswrapper[5118]: I0223 09:31:52.695808 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerDied","Data":"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f"} Feb 23 09:31:53 crc kubenswrapper[5118]: I0223 09:31:53.716062 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerStarted","Data":"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c"} Feb 23 09:31:53 crc kubenswrapper[5118]: I0223 09:31:53.744536 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tc62l" podStartSLOduration=3.233353383 podStartE2EDuration="9.744510984s" podCreationTimestamp="2026-02-23 09:31:44 +0000 UTC" firstStartedPulling="2026-02-23 09:31:46.612589651 +0000 UTC 
m=+9969.616374224" lastFinishedPulling="2026-02-23 09:31:53.123747242 +0000 UTC m=+9976.127531825" observedRunningTime="2026-02-23 09:31:53.742043314 +0000 UTC m=+9976.745827897" watchObservedRunningTime="2026-02-23 09:31:53.744510984 +0000 UTC m=+9976.748295577" Feb 23 09:31:55 crc kubenswrapper[5118]: I0223 09:31:55.424897 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:55 crc kubenswrapper[5118]: I0223 09:31:55.425296 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:31:56 crc kubenswrapper[5118]: I0223 09:31:56.487832 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tc62l" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="registry-server" probeResult="failure" output=< Feb 23 09:31:56 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:31:56 crc kubenswrapper[5118]: > Feb 23 09:32:05 crc kubenswrapper[5118]: I0223 09:32:05.490722 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:32:05 crc kubenswrapper[5118]: I0223 09:32:05.551083 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:32:05 crc kubenswrapper[5118]: I0223 09:32:05.739514 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:32:06 crc kubenswrapper[5118]: I0223 09:32:06.857856 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tc62l" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="registry-server" containerID="cri-o://dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c" gracePeriod=2 Feb 23 09:32:07 crc 
kubenswrapper[5118]: I0223 09:32:07.671256 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.773724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content\") pod \"40dd765e-3681-44ba-ac44-f2755fd992d4\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.773763 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stjvl\" (UniqueName: \"kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl\") pod \"40dd765e-3681-44ba-ac44-f2755fd992d4\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.773982 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities\") pod \"40dd765e-3681-44ba-ac44-f2755fd992d4\" (UID: \"40dd765e-3681-44ba-ac44-f2755fd992d4\") " Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.774943 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities" (OuterVolumeSpecName: "utilities") pod "40dd765e-3681-44ba-ac44-f2755fd992d4" (UID: "40dd765e-3681-44ba-ac44-f2755fd992d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.796383 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl" (OuterVolumeSpecName: "kube-api-access-stjvl") pod "40dd765e-3681-44ba-ac44-f2755fd992d4" (UID: "40dd765e-3681-44ba-ac44-f2755fd992d4"). InnerVolumeSpecName "kube-api-access-stjvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.868266 5118 generic.go:334] "Generic (PLEG): container finished" podID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerID="dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c" exitCode=0 Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.868560 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerDied","Data":"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c"} Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.868586 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tc62l" event={"ID":"40dd765e-3681-44ba-ac44-f2755fd992d4","Type":"ContainerDied","Data":"7a2e0240ccf146736ffa3f880ac3c6e2541d73abe6e4367122c24ab301366a02"} Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.868602 5118 scope.go:117] "RemoveContainer" containerID="dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.868722 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tc62l" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.876302 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.876327 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stjvl\" (UniqueName: \"kubernetes.io/projected/40dd765e-3681-44ba-ac44-f2755fd992d4-kube-api-access-stjvl\") on node \"crc\" DevicePath \"\"" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.895968 5118 scope.go:117] "RemoveContainer" containerID="35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.923508 5118 scope.go:117] "RemoveContainer" containerID="af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.964172 5118 scope.go:117] "RemoveContainer" containerID="dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c" Feb 23 09:32:07 crc kubenswrapper[5118]: E0223 09:32:07.964608 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c\": container with ID starting with dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c not found: ID does not exist" containerID="dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.964641 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c"} err="failed to get container status \"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c\": rpc error: code = NotFound desc = could not 
find container \"dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c\": container with ID starting with dfeaeca686dcd8eea6c09605b2407f912c3e43a8fdd171d6ef04250096a1a29c not found: ID does not exist" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.964663 5118 scope.go:117] "RemoveContainer" containerID="35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f" Feb 23 09:32:07 crc kubenswrapper[5118]: E0223 09:32:07.964948 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f\": container with ID starting with 35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f not found: ID does not exist" containerID="35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.964977 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f"} err="failed to get container status \"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f\": rpc error: code = NotFound desc = could not find container \"35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f\": container with ID starting with 35e3a02d976f3a0e329024169afa1df6cc7c6f44038b8be072682e487da77c2f not found: ID does not exist" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.964992 5118 scope.go:117] "RemoveContainer" containerID="af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439" Feb 23 09:32:07 crc kubenswrapper[5118]: E0223 09:32:07.965355 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439\": container with ID starting with af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439 not found: ID 
does not exist" containerID="af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.965377 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439"} err="failed to get container status \"af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439\": rpc error: code = NotFound desc = could not find container \"af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439\": container with ID starting with af4f3614b688eed3c8102eaf943555bec6b51b7d651dfb94574f2c3d4434d439 not found: ID does not exist" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.977954 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40dd765e-3681-44ba-ac44-f2755fd992d4" (UID: "40dd765e-3681-44ba-ac44-f2755fd992d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:32:07 crc kubenswrapper[5118]: I0223 09:32:07.978265 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40dd765e-3681-44ba-ac44-f2755fd992d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:32:08 crc kubenswrapper[5118]: I0223 09:32:08.201585 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:32:08 crc kubenswrapper[5118]: I0223 09:32:08.211853 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tc62l"] Feb 23 09:32:09 crc kubenswrapper[5118]: I0223 09:32:09.732662 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" path="/var/lib/kubelet/pods/40dd765e-3681-44ba-ac44-f2755fd992d4/volumes" Feb 23 09:33:18 crc kubenswrapper[5118]: I0223 09:33:18.529642 5118 trace.go:236] Trace[697879196]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (23-Feb-2026 09:33:17.442) (total time: 1086ms): Feb 23 09:33:18 crc kubenswrapper[5118]: Trace[697879196]: [1.086875979s] [1.086875979s] END Feb 23 09:33:23 crc kubenswrapper[5118]: I0223 09:33:23.678783 5118 generic.go:334] "Generic (PLEG): container finished" podID="a52048ce-f33c-40af-ac4b-8d6981ff6f60" containerID="7920f34e5ea254f0204db4c39260ec689fb273994e401351c31777630f0f5eee" exitCode=0 Feb 23 09:33:23 crc kubenswrapper[5118]: I0223 09:33:23.678839 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" event={"ID":"a52048ce-f33c-40af-ac4b-8d6981ff6f60","Type":"ContainerDied","Data":"7920f34e5ea254f0204db4c39260ec689fb273994e401351c31777630f0f5eee"} Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.152943 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.263510 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.263593 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.263636 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l9qc\" (UniqueName: \"kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.263724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.264407 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 
crc kubenswrapper[5118]: I0223 09:33:25.264535 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.264570 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.264623 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle\") pod \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\" (UID: \"a52048ce-f33c-40af-ac4b-8d6981ff6f60\") " Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.269606 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc" (OuterVolumeSpecName: "kube-api-access-5l9qc") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "kube-api-access-5l9qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.269833 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.273209 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph" (OuterVolumeSpecName: "ceph") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.292495 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.297490 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory" (OuterVolumeSpecName: "inventory") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.300426 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.301674 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.312359 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a52048ce-f33c-40af-ac4b-8d6981ff6f60" (UID: "a52048ce-f33c-40af-ac4b-8d6981ff6f60"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367839 5118 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367877 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367891 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9qc\" (UniqueName: \"kubernetes.io/projected/a52048ce-f33c-40af-ac4b-8d6981ff6f60-kube-api-access-5l9qc\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367907 5118 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367920 5118 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367933 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367944 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.367955 5118 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52048ce-f33c-40af-ac4b-8d6981ff6f60-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.704464 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.714539 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-56ljn" event={"ID":"a52048ce-f33c-40af-ac4b-8d6981ff6f60","Type":"ContainerDied","Data":"e246d0c6aa6310ed4d975057d43ec5fa962b8a1817572b9cee51f40c94e37df0"} Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.714585 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e246d0c6aa6310ed4d975057d43ec5fa962b8a1817572b9cee51f40c94e37df0" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.972844 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-64pnn"] Feb 23 09:33:25 crc kubenswrapper[5118]: E0223 09:33:25.973361 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="extract-content" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973387 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="extract-content" Feb 23 09:33:25 crc kubenswrapper[5118]: E0223 09:33:25.973406 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52048ce-f33c-40af-ac4b-8d6981ff6f60" containerName="telemetry-openstack-openstack-cell1" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973416 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52048ce-f33c-40af-ac4b-8d6981ff6f60" containerName="telemetry-openstack-openstack-cell1" Feb 23 09:33:25 crc kubenswrapper[5118]: E0223 09:33:25.973435 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="registry-server" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973444 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="registry-server" Feb 23 09:33:25 crc kubenswrapper[5118]: E0223 09:33:25.973458 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="extract-utilities" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973467 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="extract-utilities" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973718 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52048ce-f33c-40af-ac4b-8d6981ff6f60" containerName="telemetry-openstack-openstack-cell1" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.973761 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="40dd765e-3681-44ba-ac44-f2755fd992d4" containerName="registry-server" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.974654 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.977023 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.977452 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.977521 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.977677 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.979974 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:33:25 crc kubenswrapper[5118]: I0223 09:33:25.984165 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-64pnn"] Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.080352 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.080650 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: 
\"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.080868 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.081364 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.081733 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.081799 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksfrt\" (UniqueName: \"kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.184012 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.184776 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksfrt\" (UniqueName: \"kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.184942 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.185131 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.185295 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" 
(UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.185403 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.188791 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.189577 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.189745 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.190237 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.193861 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.213832 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksfrt\" (UniqueName: \"kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt\") pod \"neutron-sriov-openstack-openstack-cell1-64pnn\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.310280 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:33:26 crc kubenswrapper[5118]: I0223 09:33:26.966445 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-64pnn"] Feb 23 09:33:27 crc kubenswrapper[5118]: I0223 09:33:27.737170 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" event={"ID":"7b2f0e3e-3ab3-4f48-9823-2253d2267393","Type":"ContainerStarted","Data":"4455fb905eaf282a0ae837df7f92930cec62b52c706312ea656937d1cee4d755"} Feb 23 09:33:27 crc kubenswrapper[5118]: I0223 09:33:27.737472 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" event={"ID":"7b2f0e3e-3ab3-4f48-9823-2253d2267393","Type":"ContainerStarted","Data":"60cd5eed51cbbbe9e0ebc58bf0a3b5d453b7dc504dc3520707bbded8705c8c65"} Feb 23 09:33:27 crc kubenswrapper[5118]: I0223 09:33:27.765817 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" podStartSLOduration=2.337979743 podStartE2EDuration="2.765784907s" podCreationTimestamp="2026-02-23 09:33:25 +0000 UTC" firstStartedPulling="2026-02-23 09:33:26.975442974 +0000 UTC m=+10069.979227547" lastFinishedPulling="2026-02-23 09:33:27.403248138 +0000 UTC m=+10070.407032711" observedRunningTime="2026-02-23 09:33:27.759933706 +0000 UTC m=+10070.763718289" watchObservedRunningTime="2026-02-23 09:33:27.765784907 +0000 UTC m=+10070.769569490" Feb 23 09:34:02 crc kubenswrapper[5118]: I0223 09:34:02.975425 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:34:02 crc kubenswrapper[5118]: I0223 09:34:02.976075 
5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:34:21 crc kubenswrapper[5118]: I0223 09:34:21.905431 5118 generic.go:334] "Generic (PLEG): container finished" podID="7b2f0e3e-3ab3-4f48-9823-2253d2267393" containerID="4455fb905eaf282a0ae837df7f92930cec62b52c706312ea656937d1cee4d755" exitCode=0 Feb 23 09:34:21 crc kubenswrapper[5118]: I0223 09:34:21.905499 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" event={"ID":"7b2f0e3e-3ab3-4f48-9823-2253d2267393","Type":"ContainerDied","Data":"4455fb905eaf282a0ae837df7f92930cec62b52c706312ea656937d1cee4d755"} Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.387548 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473053 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksfrt\" (UniqueName: \"kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473161 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473257 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473300 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473323 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.473355 5118 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1\") pod \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\" (UID: \"7b2f0e3e-3ab3-4f48-9823-2253d2267393\") " Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.479893 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph" (OuterVolumeSpecName: "ceph") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.479918 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt" (OuterVolumeSpecName: "kube-api-access-ksfrt") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). InnerVolumeSpecName "kube-api-access-ksfrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.498647 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.506780 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory" (OuterVolumeSpecName: "inventory") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.517506 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.520620 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "7b2f0e3e-3ab3-4f48-9823-2253d2267393" (UID: "7b2f0e3e-3ab3-4f48-9823-2253d2267393"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575239 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575286 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575298 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575306 5118 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575318 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksfrt\" (UniqueName: \"kubernetes.io/projected/7b2f0e3e-3ab3-4f48-9823-2253d2267393-kube-api-access-ksfrt\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.575326 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2f0e3e-3ab3-4f48-9823-2253d2267393-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.934286 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" event={"ID":"7b2f0e3e-3ab3-4f48-9823-2253d2267393","Type":"ContainerDied","Data":"60cd5eed51cbbbe9e0ebc58bf0a3b5d453b7dc504dc3520707bbded8705c8c65"} Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.934325 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60cd5eed51cbbbe9e0ebc58bf0a3b5d453b7dc504dc3520707bbded8705c8c65" Feb 23 09:34:23 crc kubenswrapper[5118]: I0223 09:34:23.934375 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-64pnn" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.079232 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6"] Feb 23 09:34:24 crc kubenswrapper[5118]: E0223 09:34:24.079999 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2f0e3e-3ab3-4f48-9823-2253d2267393" containerName="neutron-sriov-openstack-openstack-cell1" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.080024 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2f0e3e-3ab3-4f48-9823-2253d2267393" containerName="neutron-sriov-openstack-openstack-cell1" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.080304 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2f0e3e-3ab3-4f48-9823-2253d2267393" containerName="neutron-sriov-openstack-openstack-cell1" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.081231 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.084121 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.084125 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.084142 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.084449 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.085994 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.095967 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6"] Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.186735 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.186802 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: 
\"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.186941 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.187005 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.187061 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw4m\" (UniqueName: \"kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.187142 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.289227 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.289352 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clw4m\" (UniqueName: \"kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.289424 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.289610 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.289660 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.290015 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.293938 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.294021 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.295350 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.305687 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.306743 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.309347 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw4m\" (UniqueName: \"kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m\") pod \"neutron-dhcp-openstack-openstack-cell1-7q7f6\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:24 crc kubenswrapper[5118]: I0223 09:34:24.398378 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:34:25 crc kubenswrapper[5118]: I0223 09:34:25.066925 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6"] Feb 23 09:34:25 crc kubenswrapper[5118]: W0223 09:34:25.070161 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad2694f7_370d_4478_959a_4668f0b0fefc.slice/crio-ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac WatchSource:0}: Error finding container ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac: Status 404 returned error can't find the container with id ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac Feb 23 09:34:25 crc kubenswrapper[5118]: I0223 09:34:25.991786 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" event={"ID":"ad2694f7-370d-4478-959a-4668f0b0fefc","Type":"ContainerStarted","Data":"c0c803946548e200a2039a02718b0cb5d1a17a0681623b75563faf0d5ca96825"} Feb 23 09:34:25 crc kubenswrapper[5118]: I0223 09:34:25.992149 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" event={"ID":"ad2694f7-370d-4478-959a-4668f0b0fefc","Type":"ContainerStarted","Data":"ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac"} Feb 23 09:34:26 crc kubenswrapper[5118]: I0223 09:34:26.023054 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" podStartSLOduration=1.5188937070000001 podStartE2EDuration="2.023033664s" podCreationTimestamp="2026-02-23 09:34:24 +0000 UTC" firstStartedPulling="2026-02-23 09:34:25.072628298 +0000 UTC m=+10128.076412871" lastFinishedPulling="2026-02-23 09:34:25.576768215 +0000 UTC m=+10128.580552828" observedRunningTime="2026-02-23 
09:34:26.015051152 +0000 UTC m=+10129.018835725" watchObservedRunningTime="2026-02-23 09:34:26.023033664 +0000 UTC m=+10129.026818257" Feb 23 09:34:32 crc kubenswrapper[5118]: I0223 09:34:32.975414 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:34:32 crc kubenswrapper[5118]: I0223 09:34:32.976058 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:35:02 crc kubenswrapper[5118]: I0223 09:35:02.975391 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:35:02 crc kubenswrapper[5118]: I0223 09:35:02.976036 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:35:02 crc kubenswrapper[5118]: I0223 09:35:02.976224 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:35:02 crc kubenswrapper[5118]: I0223 09:35:02.977067 5118 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:35:02 crc kubenswrapper[5118]: I0223 09:35:02.977463 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08" gracePeriod=600 Feb 23 09:35:03 crc kubenswrapper[5118]: I0223 09:35:03.419401 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08" exitCode=0 Feb 23 09:35:03 crc kubenswrapper[5118]: I0223 09:35:03.419589 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08"} Feb 23 09:35:03 crc kubenswrapper[5118]: I0223 09:35:03.419746 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"} Feb 23 09:35:03 crc kubenswrapper[5118]: I0223 09:35:03.419773 5118 scope.go:117] "RemoveContainer" containerID="ac3cf26c3fb19f92224ab2ae11a809c616faa0f80c75e9d319dff0610a5dfcd9" Feb 23 09:35:27 crc kubenswrapper[5118]: I0223 09:35:27.681965 5118 generic.go:334] "Generic (PLEG): container finished" podID="ad2694f7-370d-4478-959a-4668f0b0fefc" 
containerID="c0c803946548e200a2039a02718b0cb5d1a17a0681623b75563faf0d5ca96825" exitCode=0 Feb 23 09:35:27 crc kubenswrapper[5118]: I0223 09:35:27.682193 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" event={"ID":"ad2694f7-370d-4478-959a-4668f0b0fefc","Type":"ContainerDied","Data":"c0c803946548e200a2039a02718b0cb5d1a17a0681623b75563faf0d5ca96825"} Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.199003 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.311968 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clw4m\" (UniqueName: \"kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m\") pod \"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.312019 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph\") pod \"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.312067 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0\") pod \"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.312116 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1\") pod 
\"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.312143 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle\") pod \"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.312195 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory\") pod \"ad2694f7-370d-4478-959a-4668f0b0fefc\" (UID: \"ad2694f7-370d-4478-959a-4668f0b0fefc\") " Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.318293 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph" (OuterVolumeSpecName: "ceph") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.319454 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m" (OuterVolumeSpecName: "kube-api-access-clw4m") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "kube-api-access-clw4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.319721 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.341870 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory" (OuterVolumeSpecName: "inventory") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.342422 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.344985 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "ad2694f7-370d-4478-959a-4668f0b0fefc" (UID: "ad2694f7-370d-4478-959a-4668f0b0fefc"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.416186 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clw4m\" (UniqueName: \"kubernetes.io/projected/ad2694f7-370d-4478-959a-4668f0b0fefc-kube-api-access-clw4m\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.416483 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ceph\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.416641 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.416795 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.416944 5118 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.417084 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad2694f7-370d-4478-959a-4668f0b0fefc-inventory\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.712232 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.715637 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-7q7f6" event={"ID":"ad2694f7-370d-4478-959a-4668f0b0fefc","Type":"ContainerDied","Data":"ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac"} Feb 23 09:35:29 crc kubenswrapper[5118]: I0223 09:35:29.715693 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecdb068ae731966693b292fcd4849215ea102c67397c1834251009bdd33d99ac" Feb 23 09:35:49 crc kubenswrapper[5118]: I0223 09:35:49.441458 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:49 crc kubenswrapper[5118]: I0223 09:35:49.445022 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8e3a321083b8fe17b275eb3b0b34b396e8af35cd8e7b81623428981c66d976f7" gracePeriod=30 Feb 23 09:35:49 crc kubenswrapper[5118]: I0223 09:35:49.470408 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:49 crc kubenswrapper[5118]: I0223 09:35:49.470647 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.569298 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.570139 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-log" containerID="cri-o://7cecc2661030d87870d8645913fc54230ed2c8e4402656e658611a330679ec48" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.570716 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-api" containerID="cri-o://49bcaced6476503e7831711bb37e6b1e48108345338319e243caad11e097fa80" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.590248 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.590482 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerName="nova-scheduler-scheduler" containerID="cri-o://b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.616820 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.617085 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-log" containerID="cri-o://404cb13a74f8a0d840e7ef560b45a9bba5286fdbba1c44359a1dc0c3d3698763" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.617256 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-metadata" containerID="cri-o://c56fbd16f80d712125d16d5cc5cf324d93722944ee8505318a95e6cc986a0499" gracePeriod=30 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.836498 5118 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.941230 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data\") pod \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.941452 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs5nv\" (UniqueName: \"kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv\") pod \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.941684 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle\") pod \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\" (UID: \"fe565b17-17ee-4719-969b-0e4ee9cda9b5\") " Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.950254 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv" (OuterVolumeSpecName: "kube-api-access-qs5nv") pod "fe565b17-17ee-4719-969b-0e4ee9cda9b5" (UID: "fe565b17-17ee-4719-969b-0e4ee9cda9b5"). InnerVolumeSpecName "kube-api-access-qs5nv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.953852 5118 generic.go:334] "Generic (PLEG): container finished" podID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" containerID="d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b" exitCode=0 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.953935 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fe565b17-17ee-4719-969b-0e4ee9cda9b5","Type":"ContainerDied","Data":"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b"} Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.953962 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fe565b17-17ee-4719-969b-0e4ee9cda9b5","Type":"ContainerDied","Data":"8140d2d30c43be9152c73623a0b0a737ce28c27bfe3450591ff90fdefdd7ee55"} Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.953979 5118 scope.go:117] "RemoveContainer" containerID="d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.954175 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.959977 5118 generic.go:334] "Generic (PLEG): container finished" podID="5203da4c-1034-4cfa-9671-18b8de107d14" containerID="7cecc2661030d87870d8645913fc54230ed2c8e4402656e658611a330679ec48" exitCode=143 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.960040 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerDied","Data":"7cecc2661030d87870d8645913fc54230ed2c8e4402656e658611a330679ec48"} Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.961330 5118 generic.go:334] "Generic (PLEG): container finished" podID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" containerID="8e3a321083b8fe17b275eb3b0b34b396e8af35cd8e7b81623428981c66d976f7" exitCode=0 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.961360 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5314407a-1b6c-4ad5-89aa-c876d1aaa379","Type":"ContainerDied","Data":"8e3a321083b8fe17b275eb3b0b34b396e8af35cd8e7b81623428981c66d976f7"} Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.964019 5118 generic.go:334] "Generic (PLEG): container finished" podID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerID="404cb13a74f8a0d840e7ef560b45a9bba5286fdbba1c44359a1dc0c3d3698763" exitCode=143 Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.964051 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerDied","Data":"404cb13a74f8a0d840e7ef560b45a9bba5286fdbba1c44359a1dc0c3d3698763"} Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.968459 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data" (OuterVolumeSpecName: "config-data") pod 
"fe565b17-17ee-4719-969b-0e4ee9cda9b5" (UID: "fe565b17-17ee-4719-969b-0e4ee9cda9b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.977256 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe565b17-17ee-4719-969b-0e4ee9cda9b5" (UID: "fe565b17-17ee-4719-969b-0e4ee9cda9b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.985408 5118 scope.go:117] "RemoveContainer" containerID="d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b" Feb 23 09:35:50 crc kubenswrapper[5118]: E0223 09:35:50.985773 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b\": container with ID starting with d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b not found: ID does not exist" containerID="d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.985812 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b"} err="failed to get container status \"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b\": rpc error: code = NotFound desc = could not find container \"d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b\": container with ID starting with d60b491b44172a0e39ee9dc0f5dfdcd47cc9ef329794618e75f8b7f0fb41480b not found: ID does not exist" Feb 23 09:35:50 crc kubenswrapper[5118]: I0223 09:35:50.994983 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043272 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle\") pod \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043343 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data\") pod \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043432 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9m5h\" (UniqueName: \"kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h\") pod \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\" (UID: \"5314407a-1b6c-4ad5-89aa-c876d1aaa379\") " Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043849 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs5nv\" (UniqueName: \"kubernetes.io/projected/fe565b17-17ee-4719-969b-0e4ee9cda9b5-kube-api-access-qs5nv\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043867 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.043878 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe565b17-17ee-4719-969b-0e4ee9cda9b5-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 
09:35:51.047206 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h" (OuterVolumeSpecName: "kube-api-access-z9m5h") pod "5314407a-1b6c-4ad5-89aa-c876d1aaa379" (UID: "5314407a-1b6c-4ad5-89aa-c876d1aaa379"). InnerVolumeSpecName "kube-api-access-z9m5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.069160 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5314407a-1b6c-4ad5-89aa-c876d1aaa379" (UID: "5314407a-1b6c-4ad5-89aa-c876d1aaa379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.080740 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data" (OuterVolumeSpecName: "config-data") pod "5314407a-1b6c-4ad5-89aa-c876d1aaa379" (UID: "5314407a-1b6c-4ad5-89aa-c876d1aaa379"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.146212 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.146246 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5314407a-1b6c-4ad5-89aa-c876d1aaa379-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.146256 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9m5h\" (UniqueName: \"kubernetes.io/projected/5314407a-1b6c-4ad5-89aa-c876d1aaa379-kube-api-access-z9m5h\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.289112 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.297480 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.310688 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:51 crc kubenswrapper[5118]: E0223 09:35:51.311071 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2694f7-370d-4478-959a-4668f0b0fefc" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311087 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2694f7-370d-4478-959a-4668f0b0fefc" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 23 09:35:51 crc kubenswrapper[5118]: E0223 09:35:51.311164 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" 
containerName="nova-cell0-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311171 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" containerName="nova-cell0-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: E0223 09:35:51.311183 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" containerName="nova-cell1-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311190 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" containerName="nova-cell1-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311388 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2694f7-370d-4478-959a-4668f0b0fefc" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311399 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" containerName="nova-cell0-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.311412 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" containerName="nova-cell1-conductor-conductor" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.312058 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.315520 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.349963 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.350018 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clszd\" (UniqueName: \"kubernetes.io/projected/f801af4d-7c56-423f-9590-3c9a9d813356-kube-api-access-clszd\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.350183 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.350213 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.452111 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc 
kubenswrapper[5118]: I0223 09:35:51.452261 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.452279 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clszd\" (UniqueName: \"kubernetes.io/projected/f801af4d-7c56-423f-9590-3c9a9d813356-kube-api-access-clszd\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.457152 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.457306 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f801af4d-7c56-423f-9590-3c9a9d813356-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.469965 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clszd\" (UniqueName: \"kubernetes.io/projected/f801af4d-7c56-423f-9590-3c9a9d813356-kube-api-access-clszd\") pod \"nova-cell1-conductor-0\" (UID: \"f801af4d-7c56-423f-9590-3c9a9d813356\") " pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.629225 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.707761 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe565b17-17ee-4719-969b-0e4ee9cda9b5" path="/var/lib/kubelet/pods/fe565b17-17ee-4719-969b-0e4ee9cda9b5/volumes" Feb 23 09:35:51 crc kubenswrapper[5118]: E0223 09:35:51.800328 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5314407a_1b6c_4ad5_89aa_c876d1aaa379.slice/crio-bcac2794880d02397dd8d438704ee9a87df0de3a68d5262c3477eeca56ed994c\": RecentStats: unable to find data in memory cache]" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.981521 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5314407a-1b6c-4ad5-89aa-c876d1aaa379","Type":"ContainerDied","Data":"bcac2794880d02397dd8d438704ee9a87df0de3a68d5262c3477eeca56ed994c"} Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.981597 5118 scope.go:117] "RemoveContainer" containerID="8e3a321083b8fe17b275eb3b0b34b396e8af35cd8e7b81623428981c66d976f7" Feb 23 09:35:51 crc kubenswrapper[5118]: I0223 09:35:51.981850 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.029408 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.040063 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.048180 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.049687 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.055048 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.056197 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.135535 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.166004 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.166173 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbjf\" (UniqueName: \"kubernetes.io/projected/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-kube-api-access-pbbjf\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.166261 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.268081 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.268269 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbjf\" (UniqueName: \"kubernetes.io/projected/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-kube-api-access-pbbjf\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.268332 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.272261 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.273195 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.285808 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbjf\" (UniqueName: \"kubernetes.io/projected/8f83d0f7-2228-4c9c-8990-04ccd42dafc6-kube-api-access-pbbjf\") pod \"nova-cell0-conductor-0\" (UID: 
\"8f83d0f7-2228-4c9c-8990-04ccd42dafc6\") " pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.372063 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:52 crc kubenswrapper[5118]: E0223 09:35:52.398672 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:35:52 crc kubenswrapper[5118]: E0223 09:35:52.400278 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:35:52 crc kubenswrapper[5118]: E0223 09:35:52.401614 5118 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 09:35:52 crc kubenswrapper[5118]: E0223 09:35:52.401648 5118 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerName="nova-scheduler-scheduler" Feb 23 09:35:52 crc kubenswrapper[5118]: I0223 09:35:52.848405 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 09:35:53 crc 
kubenswrapper[5118]: I0223 09:35:53.003562 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8f83d0f7-2228-4c9c-8990-04ccd42dafc6","Type":"ContainerStarted","Data":"045943b5cde240938589899ccfb062d1c9565f28485d896515667712240da074"} Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.005448 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f801af4d-7c56-423f-9590-3c9a9d813356","Type":"ContainerStarted","Data":"2ee8aa956731501aa0440304d00440f89c7f1e425a28a5f337aeb7a30a9c09b8"} Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.005482 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f801af4d-7c56-423f-9590-3c9a9d813356","Type":"ContainerStarted","Data":"705fc3e699d7677f796e6cfc41036b5643b153a10829d86164db53e6f42c68db"} Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.005683 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.032844 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.032826677 podStartE2EDuration="2.032826677s" podCreationTimestamp="2026-02-23 09:35:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:35:53.026576185 +0000 UTC m=+10216.030360768" watchObservedRunningTime="2026-02-23 09:35:53.032826677 +0000 UTC m=+10216.036611250" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.594247 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.697655 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data\") pod \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.697765 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle\") pod \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.697802 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfvk\" (UniqueName: \"kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk\") pod \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\" (UID: \"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e\") " Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.720800 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk" (OuterVolumeSpecName: "kube-api-access-nkfvk") pod "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" (UID: "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e"). InnerVolumeSpecName "kube-api-access-nkfvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.729250 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5314407a-1b6c-4ad5-89aa-c876d1aaa379" path="/var/lib/kubelet/pods/5314407a-1b6c-4ad5-89aa-c876d1aaa379/volumes" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.764620 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" (UID: "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.767428 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data" (OuterVolumeSpecName: "config-data") pod "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" (UID: "aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.802347 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.802386 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:53 crc kubenswrapper[5118]: I0223 09:35:53.802403 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfvk\" (UniqueName: \"kubernetes.io/projected/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e-kube-api-access-nkfvk\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.024243 5118 generic.go:334] "Generic (PLEG): container finished" podID="5203da4c-1034-4cfa-9671-18b8de107d14" containerID="49bcaced6476503e7831711bb37e6b1e48108345338319e243caad11e097fa80" exitCode=0 Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.024699 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerDied","Data":"49bcaced6476503e7831711bb37e6b1e48108345338319e243caad11e097fa80"} Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.031473 5118 generic.go:334] "Generic (PLEG): container finished" podID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerID="c56fbd16f80d712125d16d5cc5cf324d93722944ee8505318a95e6cc986a0499" exitCode=0 Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.031579 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerDied","Data":"c56fbd16f80d712125d16d5cc5cf324d93722944ee8505318a95e6cc986a0499"} Feb 23 09:35:54 crc 
kubenswrapper[5118]: I0223 09:35:54.033722 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8f83d0f7-2228-4c9c-8990-04ccd42dafc6","Type":"ContainerStarted","Data":"305ee6055efd544ab8c8571164731384cc64c3ad94c4bc53e5e769448db07f2d"} Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.034425 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.036420 5118 generic.go:334] "Generic (PLEG): container finished" podID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" exitCode=0 Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.036925 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e","Type":"ContainerDied","Data":"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9"} Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.036973 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e","Type":"ContainerDied","Data":"00a165fd24be78aae22c99fffcd59ae365a879b1fa85030f431941a23e5d2145"} Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.036996 5118 scope.go:117] "RemoveContainer" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.038726 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.073320 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.073298596 podStartE2EDuration="2.073298596s" podCreationTimestamp="2026-02-23 09:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:35:54.054681355 +0000 UTC m=+10217.058465938" watchObservedRunningTime="2026-02-23 09:35:54.073298596 +0000 UTC m=+10217.077083169" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.095999 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.104629 5118 scope.go:117] "RemoveContainer" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" Feb 23 09:35:54 crc kubenswrapper[5118]: E0223 09:35:54.110302 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9\": container with ID starting with b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9 not found: ID does not exist" containerID="b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.110360 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9"} err="failed to get container status \"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9\": rpc error: code = NotFound desc = could not find container \"b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9\": container with ID starting with b7e6d5e556d6e4d4735e9f0afa94bde850256a6e6c183b4dd057349aa43cb5e9 not found: ID 
does not exist" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.121961 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.137348 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:54 crc kubenswrapper[5118]: E0223 09:35:54.137708 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerName="nova-scheduler-scheduler" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.137728 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerName="nova-scheduler-scheduler" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.137972 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" containerName="nova-scheduler-scheduler" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.138710 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.140313 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.146233 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.155219 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213223 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs\") pod \"5203da4c-1034-4cfa-9671-18b8de107d14\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213263 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data\") pod \"5203da4c-1034-4cfa-9671-18b8de107d14\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213285 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle\") pod \"5203da4c-1034-4cfa-9671-18b8de107d14\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213464 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktrf\" (UniqueName: \"kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf\") pod \"5203da4c-1034-4cfa-9671-18b8de107d14\" (UID: \"5203da4c-1034-4cfa-9671-18b8de107d14\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213717 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" 
Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213750 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xdq\" (UniqueName: \"kubernetes.io/projected/bd75b7e3-5c92-4b37-adba-ef56c1507da6-kube-api-access-s8xdq\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.213815 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-config-data\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.217403 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs" (OuterVolumeSpecName: "logs") pod "5203da4c-1034-4cfa-9671-18b8de107d14" (UID: "5203da4c-1034-4cfa-9671-18b8de107d14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.222371 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf" (OuterVolumeSpecName: "kube-api-access-9ktrf") pod "5203da4c-1034-4cfa-9671-18b8de107d14" (UID: "5203da4c-1034-4cfa-9671-18b8de107d14"). InnerVolumeSpecName "kube-api-access-9ktrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.247735 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5203da4c-1034-4cfa-9671-18b8de107d14" (UID: "5203da4c-1034-4cfa-9671-18b8de107d14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.248886 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data" (OuterVolumeSpecName: "config-data") pod "5203da4c-1034-4cfa-9671-18b8de107d14" (UID: "5203da4c-1034-4cfa-9671-18b8de107d14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.315965 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-config-data\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316544 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316620 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xdq\" (UniqueName: \"kubernetes.io/projected/bd75b7e3-5c92-4b37-adba-ef56c1507da6-kube-api-access-s8xdq\") pod \"nova-scheduler-0\" (UID: 
\"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316676 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktrf\" (UniqueName: \"kubernetes.io/projected/5203da4c-1034-4cfa-9671-18b8de107d14-kube-api-access-9ktrf\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316687 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5203da4c-1034-4cfa-9671-18b8de107d14-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316696 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.316705 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5203da4c-1034-4cfa-9671-18b8de107d14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.319971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-config-data\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.322981 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.331456 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xdq\" (UniqueName: \"kubernetes.io/projected/bd75b7e3-5c92-4b37-adba-ef56c1507da6-kube-api-access-s8xdq\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.333723 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd75b7e3-5c92-4b37-adba-ef56c1507da6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd75b7e3-5c92-4b37-adba-ef56c1507da6\") " pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.418079 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs\") pod \"2fa1ea1b-b64d-48fd-a760-75dce9412009\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.418162 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle\") pod \"2fa1ea1b-b64d-48fd-a760-75dce9412009\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.418182 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data\") pod \"2fa1ea1b-b64d-48fd-a760-75dce9412009\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.418303 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5nk\" 
(UniqueName: \"kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk\") pod \"2fa1ea1b-b64d-48fd-a760-75dce9412009\" (UID: \"2fa1ea1b-b64d-48fd-a760-75dce9412009\") " Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.421787 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs" (OuterVolumeSpecName: "logs") pod "2fa1ea1b-b64d-48fd-a760-75dce9412009" (UID: "2fa1ea1b-b64d-48fd-a760-75dce9412009"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.423781 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk" (OuterVolumeSpecName: "kube-api-access-rt5nk") pod "2fa1ea1b-b64d-48fd-a760-75dce9412009" (UID: "2fa1ea1b-b64d-48fd-a760-75dce9412009"). InnerVolumeSpecName "kube-api-access-rt5nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.443602 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data" (OuterVolumeSpecName: "config-data") pod "2fa1ea1b-b64d-48fd-a760-75dce9412009" (UID: "2fa1ea1b-b64d-48fd-a760-75dce9412009"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.465895 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fa1ea1b-b64d-48fd-a760-75dce9412009" (UID: "2fa1ea1b-b64d-48fd-a760-75dce9412009"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.482087 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.524596 5118 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fa1ea1b-b64d-48fd-a760-75dce9412009-logs\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.524642 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.524655 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fa1ea1b-b64d-48fd-a760-75dce9412009-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.524669 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5nk\" (UniqueName: \"kubernetes.io/projected/2fa1ea1b-b64d-48fd-a760-75dce9412009-kube-api-access-rt5nk\") on node \"crc\" DevicePath \"\"" Feb 23 09:35:54 crc kubenswrapper[5118]: I0223 09:35:54.967581 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 09:35:54 crc kubenswrapper[5118]: W0223 09:35:54.974748 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd75b7e3_5c92_4b37_adba_ef56c1507da6.slice/crio-ea86d53697f24d0d9dc6ccafdf9065c624b757b3f0e85ccc43744d7d96d5eaec WatchSource:0}: Error finding container ea86d53697f24d0d9dc6ccafdf9065c624b757b3f0e85ccc43744d7d96d5eaec: Status 404 returned error can't find the container with id ea86d53697f24d0d9dc6ccafdf9065c624b757b3f0e85ccc43744d7d96d5eaec Feb 23 09:35:55 crc 
kubenswrapper[5118]: I0223 09:35:55.053507 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5203da4c-1034-4cfa-9671-18b8de107d14","Type":"ContainerDied","Data":"8909dce1f456926b718dbf7ab8c780846a9ea8cf2e84096c7f89a5754501a243"}
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.053555 5118 scope.go:117] "RemoveContainer" containerID="49bcaced6476503e7831711bb37e6b1e48108345338319e243caad11e097fa80"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.053689 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.058826 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2fa1ea1b-b64d-48fd-a760-75dce9412009","Type":"ContainerDied","Data":"cc29465f9ae403d3f12a55c357d0f476f2ff0d1dbfdcd07d3053d4fda66479ec"}
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.058914 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.064895 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd75b7e3-5c92-4b37-adba-ef56c1507da6","Type":"ContainerStarted","Data":"ea86d53697f24d0d9dc6ccafdf9065c624b757b3f0e85ccc43744d7d96d5eaec"}
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.138581 5118 scope.go:117] "RemoveContainer" containerID="7cecc2661030d87870d8645913fc54230ed2c8e4402656e658611a330679ec48"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.160354 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.176931 5118 scope.go:117] "RemoveContainer" containerID="c56fbd16f80d712125d16d5cc5cf324d93722944ee8505318a95e6cc986a0499"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.180444 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.188919 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.202476 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.215482 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: E0223 09:35:55.216088 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220026 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: E0223 09:35:55.220166 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-api"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220221 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-api"
Feb 23 09:35:55 crc kubenswrapper[5118]: E0223 09:35:55.220269 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-metadata"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220319 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-metadata"
Feb 23 09:35:55 crc kubenswrapper[5118]: E0223 09:35:55.220402 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220456 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220747 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220821 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-metadata"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220880 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" containerName="nova-api-api"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.220929 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" containerName="nova-metadata-log"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.222023 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.225407 5118 scope.go:117] "RemoveContainer" containerID="404cb13a74f8a0d840e7ef560b45a9bba5286fdbba1c44359a1dc0c3d3698763"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.229282 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.230934 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.241278 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.243165 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.245184 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.255069 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-config-data\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.255175 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.255373 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1483f138-18e5-4827-81a6-db3d0b06551e-logs\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.255461 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv29g\" (UniqueName: \"kubernetes.io/projected/1483f138-18e5-4827-81a6-db3d0b06551e-kube-api-access-qv29g\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.264837 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.356853 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg9bh\" (UniqueName: \"kubernetes.io/projected/bf5e7919-f61d-4017-9be8-205d13874bd5-kube-api-access-rg9bh\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.356929 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5e7919-f61d-4017-9be8-205d13874bd5-logs\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.356996 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-config-data\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.357039 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-config-data\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.357067 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.357366 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1483f138-18e5-4827-81a6-db3d0b06551e-logs\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.357422 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.357528 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv29g\" (UniqueName: \"kubernetes.io/projected/1483f138-18e5-4827-81a6-db3d0b06551e-kube-api-access-qv29g\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.358043 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1483f138-18e5-4827-81a6-db3d0b06551e-logs\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.361985 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-config-data\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.362983 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1483f138-18e5-4827-81a6-db3d0b06551e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.376409 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv29g\" (UniqueName: \"kubernetes.io/projected/1483f138-18e5-4827-81a6-db3d0b06551e-kube-api-access-qv29g\") pod \"nova-metadata-0\" (UID: \"1483f138-18e5-4827-81a6-db3d0b06551e\") " pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.459510 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-config-data\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.459866 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.460000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9bh\" (UniqueName: \"kubernetes.io/projected/bf5e7919-f61d-4017-9be8-205d13874bd5-kube-api-access-rg9bh\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.460081 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5e7919-f61d-4017-9be8-205d13874bd5-logs\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.460870 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5e7919-f61d-4017-9be8-205d13874bd5-logs\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.463812 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.466498 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5e7919-f61d-4017-9be8-205d13874bd5-config-data\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.478465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9bh\" (UniqueName: \"kubernetes.io/projected/bf5e7919-f61d-4017-9be8-205d13874bd5-kube-api-access-rg9bh\") pod \"nova-api-0\" (UID: \"bf5e7919-f61d-4017-9be8-205d13874bd5\") " pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.543464 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.565725 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.709384 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa1ea1b-b64d-48fd-a760-75dce9412009" path="/var/lib/kubelet/pods/2fa1ea1b-b64d-48fd-a760-75dce9412009/volumes"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.710452 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5203da4c-1034-4cfa-9671-18b8de107d14" path="/var/lib/kubelet/pods/5203da4c-1034-4cfa-9671-18b8de107d14/volumes"
Feb 23 09:35:55 crc kubenswrapper[5118]: I0223 09:35:55.711152 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e" path="/var/lib/kubelet/pods/aa6e0c82-0828-4ecb-a1af-fbbd299a2c0e/volumes"
Feb 23 09:35:56 crc kubenswrapper[5118]: W0223 09:35:56.023820 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1483f138_18e5_4827_81a6_db3d0b06551e.slice/crio-db4988ee64b6caa9d62769eff1be34e0691d90ddb39fa3585da6e9e1af5fae77 WatchSource:0}: Error finding container db4988ee64b6caa9d62769eff1be34e0691d90ddb39fa3585da6e9e1af5fae77: Status 404 returned error can't find the container with id db4988ee64b6caa9d62769eff1be34e0691d90ddb39fa3585da6e9e1af5fae77
Feb 23 09:35:56 crc kubenswrapper[5118]: I0223 09:35:56.030356 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 09:35:56 crc kubenswrapper[5118]: I0223 09:35:56.084514 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd75b7e3-5c92-4b37-adba-ef56c1507da6","Type":"ContainerStarted","Data":"793831dfb595288b0456af86d56d6cf38720c7f0f1aceba412d4147970a36b76"}
Feb 23 09:35:56 crc kubenswrapper[5118]: I0223 09:35:56.088439 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1483f138-18e5-4827-81a6-db3d0b06551e","Type":"ContainerStarted","Data":"db4988ee64b6caa9d62769eff1be34e0691d90ddb39fa3585da6e9e1af5fae77"}
Feb 23 09:35:56 crc kubenswrapper[5118]: W0223 09:35:56.119838 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5e7919_f61d_4017_9be8_205d13874bd5.slice/crio-446eb372775200daa143f1d27753ce264c29ec935724a081358d5b2a762644ad WatchSource:0}: Error finding container 446eb372775200daa143f1d27753ce264c29ec935724a081358d5b2a762644ad: Status 404 returned error can't find the container with id 446eb372775200daa143f1d27753ce264c29ec935724a081358d5b2a762644ad
Feb 23 09:35:56 crc kubenswrapper[5118]: I0223 09:35:56.121011 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 23 09:35:56 crc kubenswrapper[5118]: I0223 09:35:56.128225 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.128207903 podStartE2EDuration="2.128207903s" podCreationTimestamp="2026-02-23 09:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:35:56.103456603 +0000 UTC m=+10219.107241176" watchObservedRunningTime="2026-02-23 09:35:56.128207903 +0000 UTC m=+10219.131992496"
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.112038 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf5e7919-f61d-4017-9be8-205d13874bd5","Type":"ContainerStarted","Data":"40d8a1e39a6407519293460ca2a75aaba6aeb921d6103f2eaa0c2005f06248f8"}
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.112515 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf5e7919-f61d-4017-9be8-205d13874bd5","Type":"ContainerStarted","Data":"fee76688091f0dacf9aad4d1c6320ef2ab409d33eca8bedab16055202586978b"}
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.112538 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bf5e7919-f61d-4017-9be8-205d13874bd5","Type":"ContainerStarted","Data":"446eb372775200daa143f1d27753ce264c29ec935724a081358d5b2a762644ad"}
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.117571 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1483f138-18e5-4827-81a6-db3d0b06551e","Type":"ContainerStarted","Data":"287d8d8d4ef85519d4f148fd5fef04cb16034c2926cea766db1018e0541a6a0c"}
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.117615 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1483f138-18e5-4827-81a6-db3d0b06551e","Type":"ContainerStarted","Data":"95b6b49772a2659ed11b1492ca37aad76784f1bba60d8c3c9605537d994bdf0b"}
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.146504 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.146447733 podStartE2EDuration="2.146447733s" podCreationTimestamp="2026-02-23 09:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:35:57.13764239 +0000 UTC m=+10220.141426973" watchObservedRunningTime="2026-02-23 09:35:57.146447733 +0000 UTC m=+10220.150232316"
Feb 23 09:35:57 crc kubenswrapper[5118]: I0223 09:35:57.170280 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.170262191 podStartE2EDuration="2.170262191s" podCreationTimestamp="2026-02-23 09:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 09:35:57.165623999 +0000 UTC m=+10220.169408582" watchObservedRunningTime="2026-02-23 09:35:57.170262191 +0000 UTC m=+10220.174046764"
Feb 23 09:35:59 crc kubenswrapper[5118]: I0223 09:35:59.483730 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 23 09:36:00 crc kubenswrapper[5118]: I0223 09:36:00.545252 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 09:36:00 crc kubenswrapper[5118]: I0223 09:36:00.545672 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 09:36:01 crc kubenswrapper[5118]: I0223 09:36:01.657180 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 23 09:36:02 crc kubenswrapper[5118]: I0223 09:36:02.401623 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 23 09:36:04 crc kubenswrapper[5118]: I0223 09:36:04.483151 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 23 09:36:04 crc kubenswrapper[5118]: I0223 09:36:04.516488 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 23 09:36:05 crc kubenswrapper[5118]: I0223 09:36:05.241444 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 23 09:36:05 crc kubenswrapper[5118]: I0223 09:36:05.545436 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 09:36:05 crc kubenswrapper[5118]: I0223 09:36:05.545502 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 23 09:36:05 crc kubenswrapper[5118]: I0223 09:36:05.566946 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 09:36:05 crc kubenswrapper[5118]: I0223 09:36:05.567011 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 09:36:06 crc kubenswrapper[5118]: I0223 09:36:06.710337 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf5e7919-f61d-4017-9be8-205d13874bd5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 09:36:06 crc kubenswrapper[5118]: I0223 09:36:06.710333 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1483f138-18e5-4827-81a6-db3d0b06551e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 09:36:06 crc kubenswrapper[5118]: I0223 09:36:06.710378 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bf5e7919-f61d-4017-9be8-205d13874bd5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 09:36:06 crc kubenswrapper[5118]: I0223 09:36:06.710419 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1483f138-18e5-4827-81a6-db3d0b06551e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.547246 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.547827 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.551027 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.551493 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.570374 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.570641 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.570892 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.570907 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.573395 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 09:36:15 crc kubenswrapper[5118]: I0223 09:36:15.581787 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.829313 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"]
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.830851 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.833234 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.833260 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-xrvtg"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.833239 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.833334 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.833601 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.850315 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"]
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.850491 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.851659 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937133 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937206 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937257 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937316 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937350 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937383 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937415 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937453 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937476 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937505 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937550 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937589 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:16 crc kubenswrapper[5118]: I0223 09:36:16.937608 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqv5\" (UniqueName: \"kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039413 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039471 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqv5\" (UniqueName: \"kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039530 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039564 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039627 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039686 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039719 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039761 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039810 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039861 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039891 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039918 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.039976 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName:
\"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.041718 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.042339 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.046362 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.047741 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.051276 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.051595 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.051727 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.051821 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.052387 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.053546 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.053858 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.056344 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.058058 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqv5\" (UniqueName: \"kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.156895 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" Feb 23 09:36:17 crc kubenswrapper[5118]: W0223 09:36:17.883359 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8285a362_7003_43c3_969a_88ee748d09e8.slice/crio-a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7 WatchSource:0}: Error finding container a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7: Status 404 returned error can't find the container with id a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7 Feb 23 09:36:17 crc kubenswrapper[5118]: I0223 09:36:17.885898 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"] Feb 23 09:36:18 crc kubenswrapper[5118]: I0223 09:36:18.396088 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" event={"ID":"8285a362-7003-43c3-969a-88ee748d09e8","Type":"ContainerStarted","Data":"a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7"} Feb 23 09:36:19 crc kubenswrapper[5118]: I0223 09:36:19.411534 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" 
event={"ID":"8285a362-7003-43c3-969a-88ee748d09e8","Type":"ContainerStarted","Data":"4196d1fb3dde22d651c51ebfd9f4a02ff41401cfd69c4d3848d0c7176686a464"} Feb 23 09:36:19 crc kubenswrapper[5118]: I0223 09:36:19.439987 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" podStartSLOduration=2.959793178 podStartE2EDuration="3.439969001s" podCreationTimestamp="2026-02-23 09:36:16 +0000 UTC" firstStartedPulling="2026-02-23 09:36:17.885438338 +0000 UTC m=+10240.889222901" lastFinishedPulling="2026-02-23 09:36:18.365614151 +0000 UTC m=+10241.369398724" observedRunningTime="2026-02-23 09:36:19.431739282 +0000 UTC m=+10242.435523855" watchObservedRunningTime="2026-02-23 09:36:19.439969001 +0000 UTC m=+10242.443753574" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.383292 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.386296 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.397145 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.397210 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d5j\" (UniqueName: \"kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.397432 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.412141 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.498841 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.498939 5118 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.498971 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d5j\" (UniqueName: \"kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.499350 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.499381 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.519601 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d5j\" (UniqueName: \"kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j\") pod \"redhat-marketplace-8ckvb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:10 crc kubenswrapper[5118]: I0223 09:37:10.713797 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:11 crc kubenswrapper[5118]: I0223 09:37:11.206601 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:11 crc kubenswrapper[5118]: I0223 09:37:11.990002 5118 generic.go:334] "Generic (PLEG): container finished" podID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerID="0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee" exitCode=0 Feb 23 09:37:11 crc kubenswrapper[5118]: I0223 09:37:11.990137 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerDied","Data":"0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee"} Feb 23 09:37:11 crc kubenswrapper[5118]: I0223 09:37:11.990359 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerStarted","Data":"ba28beb4d64c6b285da37ed0078dc603dfaf95eef1608200ef504d20e9987501"} Feb 23 09:37:11 crc kubenswrapper[5118]: I0223 09:37:11.994533 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:37:13 crc kubenswrapper[5118]: I0223 09:37:13.003390 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerStarted","Data":"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d"} Feb 23 09:37:14 crc kubenswrapper[5118]: I0223 09:37:14.017543 5118 generic.go:334] "Generic (PLEG): container finished" podID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerID="dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d" exitCode=0 Feb 23 09:37:14 crc kubenswrapper[5118]: I0223 09:37:14.017638 5118 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerDied","Data":"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d"} Feb 23 09:37:15 crc kubenswrapper[5118]: I0223 09:37:15.028830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerStarted","Data":"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1"} Feb 23 09:37:15 crc kubenswrapper[5118]: I0223 09:37:15.055642 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ckvb" podStartSLOduration=2.6518692230000003 podStartE2EDuration="5.055616949s" podCreationTimestamp="2026-02-23 09:37:10 +0000 UTC" firstStartedPulling="2026-02-23 09:37:11.994238146 +0000 UTC m=+10294.998022729" lastFinishedPulling="2026-02-23 09:37:14.397985872 +0000 UTC m=+10297.401770455" observedRunningTime="2026-02-23 09:37:15.046336443 +0000 UTC m=+10298.050121016" watchObservedRunningTime="2026-02-23 09:37:15.055616949 +0000 UTC m=+10298.059401532" Feb 23 09:37:20 crc kubenswrapper[5118]: I0223 09:37:20.714268 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:20 crc kubenswrapper[5118]: I0223 09:37:20.714671 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:20 crc kubenswrapper[5118]: I0223 09:37:20.767092 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:21 crc kubenswrapper[5118]: I0223 09:37:21.174290 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:21 crc kubenswrapper[5118]: I0223 
09:37:21.242636 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:23 crc kubenswrapper[5118]: I0223 09:37:23.141298 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ckvb" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="registry-server" containerID="cri-o://3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1" gracePeriod=2 Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.100435 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.153596 5118 generic.go:334] "Generic (PLEG): container finished" podID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerID="3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1" exitCode=0 Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.153670 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ckvb" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.153667 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerDied","Data":"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1"} Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.154077 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ckvb" event={"ID":"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb","Type":"ContainerDied","Data":"ba28beb4d64c6b285da37ed0078dc603dfaf95eef1608200ef504d20e9987501"} Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.154115 5118 scope.go:117] "RemoveContainer" containerID="3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.172596 5118 scope.go:117] "RemoveContainer" containerID="dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.195007 5118 scope.go:117] "RemoveContainer" containerID="0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.199370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6d5j\" (UniqueName: \"kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j\") pod \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.199582 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities\") pod \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " Feb 23 09:37:24 crc 
kubenswrapper[5118]: I0223 09:37:24.199713 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content\") pod \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\" (UID: \"b4dfe0ef-68ba-4318-9bc9-2c930b3067bb\") " Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.203615 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities" (OuterVolumeSpecName: "utilities") pod "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" (UID: "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.216316 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j" (OuterVolumeSpecName: "kube-api-access-n6d5j") pod "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" (UID: "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb"). InnerVolumeSpecName "kube-api-access-n6d5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.231268 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" (UID: "b4dfe0ef-68ba-4318-9bc9-2c930b3067bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.299811 5118 scope.go:117] "RemoveContainer" containerID="3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1" Feb 23 09:37:24 crc kubenswrapper[5118]: E0223 09:37:24.300180 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1\": container with ID starting with 3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1 not found: ID does not exist" containerID="3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.300248 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1"} err="failed to get container status \"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1\": rpc error: code = NotFound desc = could not find container \"3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1\": container with ID starting with 3d04105b1830013501eb6cb6d333a66e579ad3013caf0544f182d5950995adc1 not found: ID does not exist" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.300391 5118 scope.go:117] "RemoveContainer" containerID="dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.301496 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.301515 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-catalog-content\") on node \"crc\" DevicePath \"\"" 
Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.301524 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6d5j\" (UniqueName: \"kubernetes.io/projected/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb-kube-api-access-n6d5j\") on node \"crc\" DevicePath \"\"" Feb 23 09:37:24 crc kubenswrapper[5118]: E0223 09:37:24.303521 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d\": container with ID starting with dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d not found: ID does not exist" containerID="dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.303556 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d"} err="failed to get container status \"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d\": rpc error: code = NotFound desc = could not find container \"dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d\": container with ID starting with dd4fc13b33c00bdd5a691bf94f1018ab229cbc5621463f24c4830bd97aa6d87d not found: ID does not exist" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.303578 5118 scope.go:117] "RemoveContainer" containerID="0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee" Feb 23 09:37:24 crc kubenswrapper[5118]: E0223 09:37:24.303837 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee\": container with ID starting with 0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee not found: ID does not exist" containerID="0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee" Feb 23 09:37:24 
crc kubenswrapper[5118]: I0223 09:37:24.303862 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee"} err="failed to get container status \"0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee\": rpc error: code = NotFound desc = could not find container \"0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee\": container with ID starting with 0c060527d872071321784b1e480b3f024b56039972b4c9e846bc4f38bacc62ee not found: ID does not exist" Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.501690 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:24 crc kubenswrapper[5118]: I0223 09:37:24.514978 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ckvb"] Feb 23 09:37:25 crc kubenswrapper[5118]: I0223 09:37:25.713432 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" path="/var/lib/kubelet/pods/b4dfe0ef-68ba-4318-9bc9-2c930b3067bb/volumes" Feb 23 09:37:32 crc kubenswrapper[5118]: I0223 09:37:32.975524 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:37:32 crc kubenswrapper[5118]: I0223 09:37:32.976842 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.688344 5118 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:02 crc kubenswrapper[5118]: E0223 09:38:02.689357 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="extract-utilities" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.689371 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="extract-utilities" Feb 23 09:38:02 crc kubenswrapper[5118]: E0223 09:38:02.689392 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="extract-content" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.689400 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="extract-content" Feb 23 09:38:02 crc kubenswrapper[5118]: E0223 09:38:02.689418 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="registry-server" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.689428 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="registry-server" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.689675 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4dfe0ef-68ba-4318-9bc9-2c930b3067bb" containerName="registry-server" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.691417 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.738047 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.763472 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.763632 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhsc\" (UniqueName: \"kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.763813 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.865551 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.865684 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pvhsc\" (UniqueName: \"kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.865730 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.866282 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.866541 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.881259 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"] Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.883212 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.897891 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhsc\" (UniqueName: \"kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc\") pod \"community-operators-sb5zr\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.916897 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"] Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.967332 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.967516 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.967682 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf89d\" (UniqueName: \"kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.975505 5118 patch_prober.go:28] 
interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:38:02 crc kubenswrapper[5118]: I0223 09:38:02.975558 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.030058 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.072423 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.072532 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf89d\" (UniqueName: \"kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.072656 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " 
pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.073594 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.073903 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.097567 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf89d\" (UniqueName: \"kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d\") pod \"certified-operators-fn8zt\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.202636 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.717466 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:03 crc kubenswrapper[5118]: I0223 09:38:03.981317 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"] Feb 23 09:38:04 crc kubenswrapper[5118]: W0223 09:38:04.014656 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae5e13f_b7ee_4e86_ac2a_63221fb573b1.slice/crio-f80ea7eaf850e7538161052876e6536a8db968b3c9d0b9db0d046c5f2d477c5f WatchSource:0}: Error finding container f80ea7eaf850e7538161052876e6536a8db968b3c9d0b9db0d046c5f2d477c5f: Status 404 returned error can't find the container with id f80ea7eaf850e7538161052876e6536a8db968b3c9d0b9db0d046c5f2d477c5f Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.631319 5118 generic.go:334] "Generic (PLEG): container finished" podID="9af813b3-395a-4a44-8371-edd425a6f173" containerID="44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60" exitCode=0 Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.631502 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerDied","Data":"44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60"} Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.631683 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerStarted","Data":"001b362d5524a6cac8af2e55e4f39d34442998d0e47c95d643a1c51cdcbcf38b"} Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.634690 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerID="2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530" exitCode=0 Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.634736 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerDied","Data":"2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530"} Feb 23 09:38:04 crc kubenswrapper[5118]: I0223 09:38:04.634760 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerStarted","Data":"f80ea7eaf850e7538161052876e6536a8db968b3c9d0b9db0d046c5f2d477c5f"} Feb 23 09:38:05 crc kubenswrapper[5118]: I0223 09:38:05.645436 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerStarted","Data":"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"} Feb 23 09:38:05 crc kubenswrapper[5118]: I0223 09:38:05.650818 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerStarted","Data":"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9"} Feb 23 09:38:07 crc kubenswrapper[5118]: I0223 09:38:07.672137 5118 generic.go:334] "Generic (PLEG): container finished" podID="9af813b3-395a-4a44-8371-edd425a6f173" containerID="0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9" exitCode=0 Feb 23 09:38:07 crc kubenswrapper[5118]: I0223 09:38:07.672205 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" 
event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerDied","Data":"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9"} Feb 23 09:38:07 crc kubenswrapper[5118]: I0223 09:38:07.677447 5118 generic.go:334] "Generic (PLEG): container finished" podID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerID="03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff" exitCode=0 Feb 23 09:38:07 crc kubenswrapper[5118]: I0223 09:38:07.677482 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerDied","Data":"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"} Feb 23 09:38:08 crc kubenswrapper[5118]: I0223 09:38:08.690235 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerStarted","Data":"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc"} Feb 23 09:38:08 crc kubenswrapper[5118]: I0223 09:38:08.693424 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerStarted","Data":"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"} Feb 23 09:38:08 crc kubenswrapper[5118]: I0223 09:38:08.722001 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sb5zr" podStartSLOduration=3.301419709 podStartE2EDuration="6.72197595s" podCreationTimestamp="2026-02-23 09:38:02 +0000 UTC" firstStartedPulling="2026-02-23 09:38:04.634067787 +0000 UTC m=+10347.637852360" lastFinishedPulling="2026-02-23 09:38:08.054624018 +0000 UTC m=+10351.058408601" observedRunningTime="2026-02-23 09:38:08.705943592 +0000 UTC m=+10351.709728175" watchObservedRunningTime="2026-02-23 09:38:08.72197595 +0000 UTC 
m=+10351.725760543" Feb 23 09:38:08 crc kubenswrapper[5118]: I0223 09:38:08.737228 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fn8zt" podStartSLOduration=3.205999375 podStartE2EDuration="6.73721051s" podCreationTimestamp="2026-02-23 09:38:02 +0000 UTC" firstStartedPulling="2026-02-23 09:38:04.643864164 +0000 UTC m=+10347.647648737" lastFinishedPulling="2026-02-23 09:38:08.175075289 +0000 UTC m=+10351.178859872" observedRunningTime="2026-02-23 09:38:08.727917924 +0000 UTC m=+10351.731702527" watchObservedRunningTime="2026-02-23 09:38:08.73721051 +0000 UTC m=+10351.740995083" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.031387 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.031786 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.083693 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.203872 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.204008 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.806311 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:13 crc kubenswrapper[5118]: I0223 09:38:13.867163 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:14 crc 
kubenswrapper[5118]: I0223 09:38:14.293645 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-fn8zt" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="registry-server" probeResult="failure" output=< Feb 23 09:38:14 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:38:14 crc kubenswrapper[5118]: > Feb 23 09:38:15 crc kubenswrapper[5118]: I0223 09:38:15.775900 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sb5zr" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="registry-server" containerID="cri-o://353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc" gracePeriod=2 Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.332404 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.361206 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content\") pod \"9af813b3-395a-4a44-8371-edd425a6f173\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.361527 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities\") pod \"9af813b3-395a-4a44-8371-edd425a6f173\" (UID: \"9af813b3-395a-4a44-8371-edd425a6f173\") " Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.361668 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhsc\" (UniqueName: \"kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc\") pod \"9af813b3-395a-4a44-8371-edd425a6f173\" (UID: 
\"9af813b3-395a-4a44-8371-edd425a6f173\") " Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.362213 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities" (OuterVolumeSpecName: "utilities") pod "9af813b3-395a-4a44-8371-edd425a6f173" (UID: "9af813b3-395a-4a44-8371-edd425a6f173"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.362436 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.372334 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc" (OuterVolumeSpecName: "kube-api-access-pvhsc") pod "9af813b3-395a-4a44-8371-edd425a6f173" (UID: "9af813b3-395a-4a44-8371-edd425a6f173"). InnerVolumeSpecName "kube-api-access-pvhsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.443269 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9af813b3-395a-4a44-8371-edd425a6f173" (UID: "9af813b3-395a-4a44-8371-edd425a6f173"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.464818 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af813b3-395a-4a44-8371-edd425a6f173-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.464850 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhsc\" (UniqueName: \"kubernetes.io/projected/9af813b3-395a-4a44-8371-edd425a6f173-kube-api-access-pvhsc\") on node \"crc\" DevicePath \"\"" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.786747 5118 generic.go:334] "Generic (PLEG): container finished" podID="9af813b3-395a-4a44-8371-edd425a6f173" containerID="353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc" exitCode=0 Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.786792 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerDied","Data":"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc"} Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.786818 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb5zr" event={"ID":"9af813b3-395a-4a44-8371-edd425a6f173","Type":"ContainerDied","Data":"001b362d5524a6cac8af2e55e4f39d34442998d0e47c95d643a1c51cdcbcf38b"} Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.786835 5118 scope.go:117] "RemoveContainer" containerID="353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.788235 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb5zr" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.808611 5118 scope.go:117] "RemoveContainer" containerID="0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.832242 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.852453 5118 scope.go:117] "RemoveContainer" containerID="44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.854947 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sb5zr"] Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.907258 5118 scope.go:117] "RemoveContainer" containerID="353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc" Feb 23 09:38:16 crc kubenswrapper[5118]: E0223 09:38:16.908326 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc\": container with ID starting with 353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc not found: ID does not exist" containerID="353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.908378 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc"} err="failed to get container status \"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc\": rpc error: code = NotFound desc = could not find container \"353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc\": container with ID starting with 353b39d1990381b8270fcbb5cace9bfda6b84accaf6b6680f6794d134dc35afc not 
found: ID does not exist" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.908410 5118 scope.go:117] "RemoveContainer" containerID="0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9" Feb 23 09:38:16 crc kubenswrapper[5118]: E0223 09:38:16.908889 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9\": container with ID starting with 0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9 not found: ID does not exist" containerID="0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.908921 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9"} err="failed to get container status \"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9\": rpc error: code = NotFound desc = could not find container \"0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9\": container with ID starting with 0458f6211a08e323e258271e037321ea6210f8d77694af488bc0f167d2f310c9 not found: ID does not exist" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.908944 5118 scope.go:117] "RemoveContainer" containerID="44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60" Feb 23 09:38:16 crc kubenswrapper[5118]: E0223 09:38:16.909364 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60\": container with ID starting with 44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60 not found: ID does not exist" containerID="44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60" Feb 23 09:38:16 crc kubenswrapper[5118]: I0223 09:38:16.909406 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60"} err="failed to get container status \"44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60\": rpc error: code = NotFound desc = could not find container \"44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60\": container with ID starting with 44682b4c621be251fb91f3d030d39674a8091bbc9081c323d78df0d0f6ec1f60 not found: ID does not exist" Feb 23 09:38:17 crc kubenswrapper[5118]: I0223 09:38:17.710742 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af813b3-395a-4a44-8371-edd425a6f173" path="/var/lib/kubelet/pods/9af813b3-395a-4a44-8371-edd425a6f173/volumes" Feb 23 09:38:23 crc kubenswrapper[5118]: I0223 09:38:23.253474 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:23 crc kubenswrapper[5118]: I0223 09:38:23.309983 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:23 crc kubenswrapper[5118]: I0223 09:38:23.498491 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"] Feb 23 09:38:24 crc kubenswrapper[5118]: I0223 09:38:24.869261 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fn8zt" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="registry-server" containerID="cri-o://af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e" gracePeriod=2 Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.420253 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fn8zt" Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.480631 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content\") pod \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.480737 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities\") pod \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.481324 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf89d\" (UniqueName: \"kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d\") pod \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\" (UID: \"aae5e13f-b7ee-4e86-ac2a-63221fb573b1\") " Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.481720 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities" (OuterVolumeSpecName: "utilities") pod "aae5e13f-b7ee-4e86-ac2a-63221fb573b1" (UID: "aae5e13f-b7ee-4e86-ac2a-63221fb573b1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.482752 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.487456 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d" (OuterVolumeSpecName: "kube-api-access-qf89d") pod "aae5e13f-b7ee-4e86-ac2a-63221fb573b1" (UID: "aae5e13f-b7ee-4e86-ac2a-63221fb573b1"). InnerVolumeSpecName "kube-api-access-qf89d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.538632 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aae5e13f-b7ee-4e86-ac2a-63221fb573b1" (UID: "aae5e13f-b7ee-4e86-ac2a-63221fb573b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.585018 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.585325 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf89d\" (UniqueName: \"kubernetes.io/projected/aae5e13f-b7ee-4e86-ac2a-63221fb573b1-kube-api-access-qf89d\") on node \"crc\" DevicePath \"\""
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.883725 5118 generic.go:334] "Generic (PLEG): container finished" podID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerID="af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e" exitCode=0
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.883774 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerDied","Data":"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"}
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.883804 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fn8zt" event={"ID":"aae5e13f-b7ee-4e86-ac2a-63221fb573b1","Type":"ContainerDied","Data":"f80ea7eaf850e7538161052876e6536a8db968b3c9d0b9db0d046c5f2d477c5f"}
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.883824 5118 scope.go:117] "RemoveContainer" containerID="af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.883882 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fn8zt"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.923594 5118 scope.go:117] "RemoveContainer" containerID="03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.929019 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"]
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.945357 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fn8zt"]
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.954340 5118 scope.go:117] "RemoveContainer" containerID="2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.989895 5118 scope.go:117] "RemoveContainer" containerID="af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"
Feb 23 09:38:25 crc kubenswrapper[5118]: E0223 09:38:25.991809 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e\": container with ID starting with af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e not found: ID does not exist" containerID="af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.991845 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e"} err="failed to get container status \"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e\": rpc error: code = NotFound desc = could not find container \"af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e\": container with ID starting with af07fea1b050fbf79a23d0855720a0117cc4daf1c0cd16399c53b58c765d8b6e not found: ID does not exist"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.991869 5118 scope.go:117] "RemoveContainer" containerID="03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"
Feb 23 09:38:25 crc kubenswrapper[5118]: E0223 09:38:25.993307 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff\": container with ID starting with 03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff not found: ID does not exist" containerID="03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.993434 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff"} err="failed to get container status \"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff\": rpc error: code = NotFound desc = could not find container \"03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff\": container with ID starting with 03f89e3b8dbe4889af8d6645056d804f2d36dce36f2606a857addf47080082ff not found: ID does not exist"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.993544 5118 scope.go:117] "RemoveContainer" containerID="2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530"
Feb 23 09:38:25 crc kubenswrapper[5118]: E0223 09:38:25.994242 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530\": container with ID starting with 2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530 not found: ID does not exist" containerID="2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530"
Feb 23 09:38:25 crc kubenswrapper[5118]: I0223 09:38:25.994265 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530"} err="failed to get container status \"2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530\": rpc error: code = NotFound desc = could not find container \"2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530\": container with ID starting with 2b68818236f89d233167b07c7544ba2f219e64ad71367a39d5963fc0a67d7530 not found: ID does not exist"
Feb 23 09:38:27 crc kubenswrapper[5118]: I0223 09:38:27.715880 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" path="/var/lib/kubelet/pods/aae5e13f-b7ee-4e86-ac2a-63221fb573b1/volumes"
Feb 23 09:38:32 crc kubenswrapper[5118]: I0223 09:38:32.975623 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:38:32 crc kubenswrapper[5118]: I0223 09:38:32.976050 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:38:32 crc kubenswrapper[5118]: I0223 09:38:32.976173 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 09:38:32 crc kubenswrapper[5118]: I0223 09:38:32.977607 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:38:32 crc kubenswrapper[5118]: I0223 09:38:32.977753 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" gracePeriod=600
Feb 23 09:38:33 crc kubenswrapper[5118]: E0223 09:38:33.108799 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:38:33 crc kubenswrapper[5118]: I0223 09:38:33.968359 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" exitCode=0
Feb 23 09:38:33 crc kubenswrapper[5118]: I0223 09:38:33.968409 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"}
Feb 23 09:38:33 crc kubenswrapper[5118]: I0223 09:38:33.968858 5118 scope.go:117] "RemoveContainer" containerID="5c08dfe76506d185a30f9c69e70151ba1ea8fe6e89ad7edb993f0bd9e6b86c08"
Feb 23 09:38:33 crc kubenswrapper[5118]: I0223 09:38:33.970079 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:38:33 crc kubenswrapper[5118]: E0223 09:38:33.970668 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:38:47 crc kubenswrapper[5118]: I0223 09:38:47.713150 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:38:47 crc kubenswrapper[5118]: E0223 09:38:47.714527 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:39:02 crc kubenswrapper[5118]: I0223 09:39:02.698056 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:39:02 crc kubenswrapper[5118]: E0223 09:39:02.698833 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:39:09 crc kubenswrapper[5118]: I0223 09:39:09.366413 5118 generic.go:334] "Generic (PLEG): container finished" podID="8285a362-7003-43c3-969a-88ee748d09e8" containerID="4196d1fb3dde22d651c51ebfd9f4a02ff41401cfd69c4d3848d0c7176686a464" exitCode=0
Feb 23 09:39:09 crc kubenswrapper[5118]: I0223 09:39:09.366500 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" event={"ID":"8285a362-7003-43c3-969a-88ee748d09e8","Type":"ContainerDied","Data":"4196d1fb3dde22d651c51ebfd9f4a02ff41401cfd69c4d3848d0c7176686a464"}
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.880739 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967656 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967723 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967745 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967769 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967805 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967917 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967963 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.967990 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsqv5\" (UniqueName: \"kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.968668 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.968703 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.968729 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.968786 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.968820 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph\") pod \"8285a362-7003-43c3-969a-88ee748d09e8\" (UID: \"8285a362-7003-43c3-969a-88ee748d09e8\") "
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.973066 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.973231 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5" (OuterVolumeSpecName: "kube-api-access-bsqv5") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "kube-api-access-bsqv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.973612 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph" (OuterVolumeSpecName: "ceph") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:10 crc kubenswrapper[5118]: I0223 09:39:10.999817 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.000119 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.001020 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.002347 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.004552 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.006234 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.006517 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.013425 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.013661 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory" (OuterVolumeSpecName: "inventory") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.018751 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8285a362-7003-43c3-969a-88ee748d09e8" (UID: "8285a362-7003-43c3-969a-88ee748d09e8"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071636 5118 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-inventory\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071666 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071697 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071707 5118 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071715 5118 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-ceph\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071724 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071732 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071740 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071749 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071779 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071788 5118 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8285a362-7003-43c3-969a-88ee748d09e8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071796 5118 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/8285a362-7003-43c3-969a-88ee748d09e8-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.071803 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsqv5\" (UniqueName: \"kubernetes.io/projected/8285a362-7003-43c3-969a-88ee748d09e8-kube-api-access-bsqv5\") on node \"crc\" DevicePath \"\""
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.403938 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5" event={"ID":"8285a362-7003-43c3-969a-88ee748d09e8","Type":"ContainerDied","Data":"a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7"}
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.404519 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a967409e74a9b24276fdc92d8c37a067540b8d63f0f81757b3e88abf5dc9c7b7"
Feb 23 09:39:11 crc kubenswrapper[5118]: I0223 09:39:11.404026 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5"
Feb 23 09:39:16 crc kubenswrapper[5118]: I0223 09:39:16.697556 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:39:16 crc kubenswrapper[5118]: E0223 09:39:16.700808 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:39:27 crc kubenswrapper[5118]: I0223 09:39:27.704733 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:39:27 crc kubenswrapper[5118]: E0223 09:39:27.705876 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:39:39 crc kubenswrapper[5118]: I0223 09:39:39.698839 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:39:39 crc kubenswrapper[5118]: E0223 09:39:39.699503 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:39:53 crc kubenswrapper[5118]: I0223 09:39:53.697165 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:39:53 crc kubenswrapper[5118]: E0223 09:39:53.698281 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:40:08 crc kubenswrapper[5118]: I0223 09:40:08.697570 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:40:08 crc kubenswrapper[5118]: E0223 09:40:08.698582 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:40:19 crc kubenswrapper[5118]: I0223 09:40:19.697795 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:40:19 crc kubenswrapper[5118]: E0223 09:40:19.698532 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:40:30 crc kubenswrapper[5118]: I0223 09:40:30.698238 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:40:30 crc kubenswrapper[5118]: E0223 09:40:30.698969 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:40:42 crc kubenswrapper[5118]: I0223 09:40:42.698582 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:40:42 crc kubenswrapper[5118]: E0223 09:40:42.699989 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:40:44 crc kubenswrapper[5118]: I0223 09:40:44.108928 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Feb 23 09:40:44 crc kubenswrapper[5118]: I0223 09:40:44.109617 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="38283e4b-8372-40cb-83b7-c1e08e09dd96" containerName="adoption" containerID="cri-o://513f8feb41a7ca6f964fae4153244686fc0f735317ffdcfe8668d70db150c8c3" gracePeriod=30
Feb 23 09:40:55 crc kubenswrapper[5118]: I0223 09:40:55.697283 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:40:55 crc kubenswrapper[5118]: E0223 09:40:55.699070 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:41:08 crc kubenswrapper[5118]: I0223 09:41:08.699198 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86"
Feb 23 09:41:08 crc kubenswrapper[5118]: E0223 09:41:08.699990 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:41:14 crc kubenswrapper[5118]: I0223 09:41:14.469953 5118 generic.go:334] "Generic (PLEG): container finished" podID="38283e4b-8372-40cb-83b7-c1e08e09dd96" containerID="513f8feb41a7ca6f964fae4153244686fc0f735317ffdcfe8668d70db150c8c3" exitCode=137
Feb 23 09:41:14 crc kubenswrapper[5118]: I0223 09:41:14.470019 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38283e4b-8372-40cb-83b7-c1e08e09dd96","Type":"ContainerDied","Data":"513f8feb41a7ca6f964fae4153244686fc0f735317ffdcfe8668d70db150c8c3"}
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.489423 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38283e4b-8372-40cb-83b7-c1e08e09dd96","Type":"ContainerDied","Data":"6e0016d09d1a32d59219ae2aeb894fc6b1b82562ed486576a8c4b86a19b79ac3"}
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.489494 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e0016d09d1a32d59219ae2aeb894fc6b1b82562ed486576a8c4b86a19b79ac3"
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.549683 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.711642 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8j8z\" (UniqueName: \"kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z\") pod \"38283e4b-8372-40cb-83b7-c1e08e09dd96\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") "
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.712678 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") pod \"38283e4b-8372-40cb-83b7-c1e08e09dd96\" (UID: \"38283e4b-8372-40cb-83b7-c1e08e09dd96\") "
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.722880 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z" (OuterVolumeSpecName: "kube-api-access-k8j8z") pod "38283e4b-8372-40cb-83b7-c1e08e09dd96" (UID: "38283e4b-8372-40cb-83b7-c1e08e09dd96"). InnerVolumeSpecName "kube-api-access-k8j8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.736041 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998" (OuterVolumeSpecName: "mariadb-data") pod "38283e4b-8372-40cb-83b7-c1e08e09dd96" (UID: "38283e4b-8372-40cb-83b7-c1e08e09dd96"). InnerVolumeSpecName "pvc-1297e80b-c55a-48fb-bdd6-5a032749d998".
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.816732 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8j8z\" (UniqueName: \"kubernetes.io/projected/38283e4b-8372-40cb-83b7-c1e08e09dd96-kube-api-access-k8j8z\") on node \"crc\" DevicePath \"\"" Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.816796 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") on node \"crc\" " Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.849624 5118 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.849819 5118 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1297e80b-c55a-48fb-bdd6-5a032749d998" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998") on node "crc" Feb 23 09:41:15 crc kubenswrapper[5118]: I0223 09:41:15.919557 5118 reconciler_common.go:293] "Volume detached for volume \"pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1297e80b-c55a-48fb-bdd6-5a032749d998\") on node \"crc\" DevicePath \"\"" Feb 23 09:41:16 crc kubenswrapper[5118]: I0223 09:41:16.502730 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 23 09:41:16 crc kubenswrapper[5118]: I0223 09:41:16.547341 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 09:41:16 crc kubenswrapper[5118]: I0223 09:41:16.557485 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 23 09:41:17 crc kubenswrapper[5118]: I0223 09:41:17.239654 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:41:17 crc kubenswrapper[5118]: I0223 09:41:17.240217 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="d23ec113-64c1-41ec-84b2-e92ad8675129" containerName="adoption" containerID="cri-o://1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63" gracePeriod=30 Feb 23 09:41:17 crc kubenswrapper[5118]: I0223 09:41:17.712822 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38283e4b-8372-40cb-83b7-c1e08e09dd96" path="/var/lib/kubelet/pods/38283e4b-8372-40cb-83b7-c1e08e09dd96/volumes" Feb 23 09:41:19 crc kubenswrapper[5118]: I0223 09:41:19.697146 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:41:19 crc kubenswrapper[5118]: E0223 09:41:19.697860 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:41:33 crc kubenswrapper[5118]: I0223 09:41:33.697927 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:41:33 crc kubenswrapper[5118]: 
E0223 09:41:33.699026 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.858628 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.900730 5118 generic.go:334] "Generic (PLEG): container finished" podID="d23ec113-64c1-41ec-84b2-e92ad8675129" containerID="1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63" exitCode=137 Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.900798 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d23ec113-64c1-41ec-84b2-e92ad8675129","Type":"ContainerDied","Data":"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63"} Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.900868 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"d23ec113-64c1-41ec-84b2-e92ad8675129","Type":"ContainerDied","Data":"00bc12bd6b712cc6014c490b2679ce04c13e9a32cc1a6bd6185e4f7a7f7a3a30"} Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.900890 5118 scope.go:117] "RemoveContainer" containerID="1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.900816 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.929650 5118 scope.go:117] "RemoveContainer" containerID="1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63" Feb 23 09:41:47 crc kubenswrapper[5118]: E0223 09:41:47.930161 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63\": container with ID starting with 1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63 not found: ID does not exist" containerID="1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.930211 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63"} err="failed to get container status \"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63\": rpc error: code = NotFound desc = could not find container \"1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63\": container with ID starting with 1c0eea1357f689f85f986606a24645a68e2efeefdad40752182bdef8c8bfdd63 not found: ID does not exist" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.959463 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") pod \"d23ec113-64c1-41ec-84b2-e92ad8675129\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.959753 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert\") pod \"d23ec113-64c1-41ec-84b2-e92ad8675129\" (UID: 
\"d23ec113-64c1-41ec-84b2-e92ad8675129\") " Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.959817 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtj78\" (UniqueName: \"kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78\") pod \"d23ec113-64c1-41ec-84b2-e92ad8675129\" (UID: \"d23ec113-64c1-41ec-84b2-e92ad8675129\") " Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.967322 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78" (OuterVolumeSpecName: "kube-api-access-vtj78") pod "d23ec113-64c1-41ec-84b2-e92ad8675129" (UID: "d23ec113-64c1-41ec-84b2-e92ad8675129"). InnerVolumeSpecName "kube-api-access-vtj78". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.967968 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "d23ec113-64c1-41ec-84b2-e92ad8675129" (UID: "d23ec113-64c1-41ec-84b2-e92ad8675129"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:41:47 crc kubenswrapper[5118]: I0223 09:41:47.980194 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691" (OuterVolumeSpecName: "ovn-data") pod "d23ec113-64c1-41ec-84b2-e92ad8675129" (UID: "d23ec113-64c1-41ec-84b2-e92ad8675129"). InnerVolumeSpecName "pvc-269946c1-b1f0-400d-ab26-e812b34a4691". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.062403 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") on node \"crc\" " Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.062698 5118 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/d23ec113-64c1-41ec-84b2-e92ad8675129-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.062769 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtj78\" (UniqueName: \"kubernetes.io/projected/d23ec113-64c1-41ec-84b2-e92ad8675129-kube-api-access-vtj78\") on node \"crc\" DevicePath \"\"" Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.086066 5118 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.086222 5118 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-269946c1-b1f0-400d-ab26-e812b34a4691" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691") on node "crc" Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.164577 5118 reconciler_common.go:293] "Volume detached for volume \"pvc-269946c1-b1f0-400d-ab26-e812b34a4691\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-269946c1-b1f0-400d-ab26-e812b34a4691\") on node \"crc\" DevicePath \"\"" Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.240693 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.254486 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 23 09:41:48 crc kubenswrapper[5118]: I0223 09:41:48.697556 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:41:48 crc kubenswrapper[5118]: E0223 09:41:48.697872 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:41:49 crc kubenswrapper[5118]: I0223 09:41:49.719962 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23ec113-64c1-41ec-84b2-e92ad8675129" path="/var/lib/kubelet/pods/d23ec113-64c1-41ec-84b2-e92ad8675129/volumes" Feb 23 09:42:01 crc kubenswrapper[5118]: I0223 09:42:01.698067 5118 scope.go:117] "RemoveContainer" 
containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:42:01 crc kubenswrapper[5118]: E0223 09:42:01.699576 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:42:08 crc kubenswrapper[5118]: I0223 09:42:08.017327 5118 scope.go:117] "RemoveContainer" containerID="513f8feb41a7ca6f964fae4153244686fc0f735317ffdcfe8668d70db150c8c3" Feb 23 09:42:12 crc kubenswrapper[5118]: I0223 09:42:12.697965 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:42:12 crc kubenswrapper[5118]: E0223 09:42:12.698769 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.869823 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871176 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="extract-content" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871200 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="extract-content" Feb 23 
09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871225 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8285a362-7003-43c3-969a-88ee748d09e8" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871239 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8285a362-7003-43c3-969a-88ee748d09e8" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871280 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871292 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871309 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="extract-utilities" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871320 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="extract-utilities" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871334 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="extract-utilities" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871344 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="extract-utilities" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871362 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871371 5118 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871390 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="extract-content" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871397 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="extract-content" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871413 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23ec113-64c1-41ec-84b2-e92ad8675129" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871421 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23ec113-64c1-41ec-84b2-e92ad8675129" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: E0223 09:42:16.871438 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38283e4b-8372-40cb-83b7-c1e08e09dd96" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871446 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="38283e4b-8372-40cb-83b7-c1e08e09dd96" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871687 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae5e13f-b7ee-4e86-ac2a-63221fb573b1" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871712 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23ec113-64c1-41ec-84b2-e92ad8675129" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871726 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af813b3-395a-4a44-8371-edd425a6f173" containerName="registry-server" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871744 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38283e4b-8372-40cb-83b7-c1e08e09dd96" containerName="adoption" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.871768 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8285a362-7003-43c3-969a-88ee748d09e8" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.873641 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:16 crc kubenswrapper[5118]: I0223 09:42:16.884544 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.044609 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.044817 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlpx\" (UniqueName: \"kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.044893 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.146456 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.146566 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlpx\" (UniqueName: \"kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.146608 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.147197 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.147280 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.165021 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lqlpx\" (UniqueName: \"kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx\") pod \"redhat-operators-bzlqk\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.202846 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:17 crc kubenswrapper[5118]: I0223 09:42:17.833373 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:18 crc kubenswrapper[5118]: I0223 09:42:18.242920 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerID="81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42" exitCode=0 Feb 23 09:42:18 crc kubenswrapper[5118]: I0223 09:42:18.242962 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerDied","Data":"81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42"} Feb 23 09:42:18 crc kubenswrapper[5118]: I0223 09:42:18.242985 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerStarted","Data":"1de8295e36cac781ba6f701dc231c411a55e5971addd0849ecaab52d80b43d00"} Feb 23 09:42:18 crc kubenswrapper[5118]: I0223 09:42:18.244939 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:42:19 crc kubenswrapper[5118]: I0223 09:42:19.254047 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerStarted","Data":"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791"} 
Feb 23 09:42:24 crc kubenswrapper[5118]: I0223 09:42:24.327611 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerID="0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791" exitCode=0 Feb 23 09:42:24 crc kubenswrapper[5118]: I0223 09:42:24.327723 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerDied","Data":"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791"} Feb 23 09:42:25 crc kubenswrapper[5118]: I0223 09:42:25.346060 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerStarted","Data":"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385"} Feb 23 09:42:25 crc kubenswrapper[5118]: I0223 09:42:25.379558 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bzlqk" podStartSLOduration=2.851314258 podStartE2EDuration="9.379534863s" podCreationTimestamp="2026-02-23 09:42:16 +0000 UTC" firstStartedPulling="2026-02-23 09:42:18.244752281 +0000 UTC m=+10601.248536844" lastFinishedPulling="2026-02-23 09:42:24.772972876 +0000 UTC m=+10607.776757449" observedRunningTime="2026-02-23 09:42:25.370773021 +0000 UTC m=+10608.374557634" watchObservedRunningTime="2026-02-23 09:42:25.379534863 +0000 UTC m=+10608.383319446" Feb 23 09:42:27 crc kubenswrapper[5118]: I0223 09:42:27.203070 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:27 crc kubenswrapper[5118]: I0223 09:42:27.203390 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:27 crc kubenswrapper[5118]: I0223 09:42:27.711441 5118 scope.go:117] 
"RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:42:27 crc kubenswrapper[5118]: E0223 09:42:27.711792 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:42:28 crc kubenswrapper[5118]: I0223 09:42:28.246728 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bzlqk" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="registry-server" probeResult="failure" output=< Feb 23 09:42:28 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:42:28 crc kubenswrapper[5118]: > Feb 23 09:42:37 crc kubenswrapper[5118]: I0223 09:42:37.258917 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:37 crc kubenswrapper[5118]: I0223 09:42:37.315610 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:37 crc kubenswrapper[5118]: I0223 09:42:37.504517 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:38 crc kubenswrapper[5118]: I0223 09:42:38.500259 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bzlqk" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="registry-server" containerID="cri-o://098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385" gracePeriod=2 Feb 23 09:42:38 crc kubenswrapper[5118]: I0223 09:42:38.697555 5118 
scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:42:38 crc kubenswrapper[5118]: E0223 09:42:38.697888 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.068092 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.132172 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqlpx\" (UniqueName: \"kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx\") pod \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.132246 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content\") pod \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.132371 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities\") pod \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\" (UID: \"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22\") " Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.133623 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities" (OuterVolumeSpecName: "utilities") pod "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" (UID: "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.147498 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx" (OuterVolumeSpecName: "kube-api-access-lqlpx") pod "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" (UID: "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22"). InnerVolumeSpecName "kube-api-access-lqlpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.233445 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqlpx\" (UniqueName: \"kubernetes.io/projected/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-kube-api-access-lqlpx\") on node \"crc\" DevicePath \"\"" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.233476 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.302040 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" (UID: "1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.336075 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.516502 5118 generic.go:334] "Generic (PLEG): container finished" podID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerID="098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385" exitCode=0 Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.516576 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bzlqk" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.516575 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerDied","Data":"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385"} Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.516768 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bzlqk" event={"ID":"1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22","Type":"ContainerDied","Data":"1de8295e36cac781ba6f701dc231c411a55e5971addd0849ecaab52d80b43d00"} Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.516818 5118 scope.go:117] "RemoveContainer" containerID="098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.556608 5118 scope.go:117] "RemoveContainer" containerID="0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.563922 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 
09:42:39.579294 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bzlqk"] Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.587795 5118 scope.go:117] "RemoveContainer" containerID="81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.641448 5118 scope.go:117] "RemoveContainer" containerID="098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385" Feb 23 09:42:39 crc kubenswrapper[5118]: E0223 09:42:39.641883 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385\": container with ID starting with 098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385 not found: ID does not exist" containerID="098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.641934 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385"} err="failed to get container status \"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385\": rpc error: code = NotFound desc = could not find container \"098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385\": container with ID starting with 098e9de86de49041943faba210efcf605620ac3710ca511ed6a7a0b3009ab385 not found: ID does not exist" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.641967 5118 scope.go:117] "RemoveContainer" containerID="0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791" Feb 23 09:42:39 crc kubenswrapper[5118]: E0223 09:42:39.642427 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791\": container with ID 
starting with 0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791 not found: ID does not exist" containerID="0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.642530 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791"} err="failed to get container status \"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791\": rpc error: code = NotFound desc = could not find container \"0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791\": container with ID starting with 0ed43faa26662c38c7f974079de25a42499359b5d637615ad3970876ca7be791 not found: ID does not exist" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.642620 5118 scope.go:117] "RemoveContainer" containerID="81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42" Feb 23 09:42:39 crc kubenswrapper[5118]: E0223 09:42:39.642923 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42\": container with ID starting with 81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42 not found: ID does not exist" containerID="81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.642953 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42"} err="failed to get container status \"81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42\": rpc error: code = NotFound desc = could not find container \"81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42\": container with ID starting with 81e9a3a31810f2ffa71fa2550253f2e17fb0810ec757f0534e3f371b0d9aae42 not found: 
ID does not exist" Feb 23 09:42:39 crc kubenswrapper[5118]: I0223 09:42:39.733869 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" path="/var/lib/kubelet/pods/1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22/volumes" Feb 23 09:42:52 crc kubenswrapper[5118]: I0223 09:42:52.697332 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:42:52 crc kubenswrapper[5118]: E0223 09:42:52.698013 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:43:04 crc kubenswrapper[5118]: I0223 09:43:04.700398 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:43:04 crc kubenswrapper[5118]: E0223 09:43:04.703615 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:43:18 crc kubenswrapper[5118]: I0223 09:43:18.698269 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:43:18 crc kubenswrapper[5118]: E0223 09:43:18.698993 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:43:33 crc kubenswrapper[5118]: I0223 09:43:33.697506 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:43:34 crc kubenswrapper[5118]: I0223 09:43:34.164208 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe"} Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.181228 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm"] Feb 23 09:45:00 crc kubenswrapper[5118]: E0223 09:45:00.182173 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="extract-content" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.182186 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="extract-content" Feb 23 09:45:00 crc kubenswrapper[5118]: E0223 09:45:00.182207 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="registry-server" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.182213 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="registry-server" Feb 23 09:45:00 crc kubenswrapper[5118]: E0223 09:45:00.182228 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="extract-utilities" Feb 
23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.182235 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="extract-utilities" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.182431 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1ab3f3-39fe-4e63-b1b0-7e1950d4be22" containerName="registry-server" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.184234 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.187622 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.187866 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.199143 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm"] Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.242079 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sp2w\" (UniqueName: \"kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.242284 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: 
\"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.242371 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.344876 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.345020 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sp2w\" (UniqueName: \"kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.345294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.348234 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.360908 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.364277 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sp2w\" (UniqueName: \"kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w\") pod \"collect-profiles-29530665-pkfdm\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.505047 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:00 crc kubenswrapper[5118]: I0223 09:45:00.994685 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm"] Feb 23 09:45:00 crc kubenswrapper[5118]: W0223 09:45:00.997308 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e474bf6_37f9_4ada_b7af_dd3e663cb6b4.slice/crio-390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f WatchSource:0}: Error finding container 390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f: Status 404 returned error can't find the container with id 390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f Feb 23 09:45:01 crc kubenswrapper[5118]: I0223 09:45:01.191064 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" event={"ID":"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4","Type":"ContainerStarted","Data":"046f74ab5f63b2934d33ba9e9a68fc672d5f784359c1eb951cf91a942db95238"} Feb 23 09:45:01 crc kubenswrapper[5118]: I0223 09:45:01.191375 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" event={"ID":"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4","Type":"ContainerStarted","Data":"390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f"} Feb 23 09:45:01 crc kubenswrapper[5118]: I0223 09:45:01.215424 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" podStartSLOduration=1.215405423 podStartE2EDuration="1.215405423s" podCreationTimestamp="2026-02-23 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 
09:45:01.205710728 +0000 UTC m=+10764.209495321" watchObservedRunningTime="2026-02-23 09:45:01.215405423 +0000 UTC m=+10764.219189996" Feb 23 09:45:02 crc kubenswrapper[5118]: I0223 09:45:02.204736 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" containerID="046f74ab5f63b2934d33ba9e9a68fc672d5f784359c1eb951cf91a942db95238" exitCode=0 Feb 23 09:45:02 crc kubenswrapper[5118]: I0223 09:45:02.204862 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" event={"ID":"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4","Type":"ContainerDied","Data":"046f74ab5f63b2934d33ba9e9a68fc672d5f784359c1eb951cf91a942db95238"} Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.123481 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.238962 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume\") pod \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.240532 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume\") pod \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\" (UID: \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.240630 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sp2w\" (UniqueName: \"kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w\") pod \"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\" (UID: 
\"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4\") " Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.241260 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" (UID: "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.241784 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" event={"ID":"5e474bf6-37f9-4ada-b7af-dd3e663cb6b4","Type":"ContainerDied","Data":"390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f"} Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.241837 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390880f28edb68812b24f55033f8a5316bc24ebee0eb8221e6881e1ebc20021f" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.241906 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530665-pkfdm" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.243945 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.249447 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w" (OuterVolumeSpecName: "kube-api-access-5sp2w") pod "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" (UID: "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4"). InnerVolumeSpecName "kube-api-access-5sp2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.259814 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" (UID: "5e474bf6-37f9-4ada-b7af-dd3e663cb6b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.283891 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk"] Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.297814 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530620-x2cvk"] Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.346581 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sp2w\" (UniqueName: \"kubernetes.io/projected/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-kube-api-access-5sp2w\") on node \"crc\" DevicePath \"\"" Feb 23 09:45:04 crc kubenswrapper[5118]: I0223 09:45:04.346620 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e474bf6-37f9-4ada-b7af-dd3e663cb6b4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 09:45:05 crc kubenswrapper[5118]: I0223 09:45:05.713750 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea9878a-b108-4feb-9385-158a04f591d6" path="/var/lib/kubelet/pods/4ea9878a-b108-4feb-9385-158a04f591d6/volumes" Feb 23 09:45:08 crc kubenswrapper[5118]: I0223 09:45:08.151743 5118 scope.go:117] "RemoveContainer" containerID="e38de7cf3e4d61cc96c53c0f4331d839db9b961f1850ec8823ede3c4e92d2b8b" Feb 23 09:45:55 crc kubenswrapper[5118]: I0223 09:45:55.111077 5118 prober.go:107] "Probe failed" probeType="Liveness" 
pod="metallb-system/frr-k8s-5vhwv" podUID="4e891361-5f36-4ebe-a394-c553a139765a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 09:46:02 crc kubenswrapper[5118]: I0223 09:46:02.975669 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:46:02 crc kubenswrapper[5118]: I0223 09:46:02.976397 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:46:32 crc kubenswrapper[5118]: I0223 09:46:32.975017 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:46:32 crc kubenswrapper[5118]: I0223 09:46:32.975472 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:47:02 crc kubenswrapper[5118]: I0223 09:47:02.975460 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:47:02 crc kubenswrapper[5118]: I0223 09:47:02.976064 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:47:02 crc kubenswrapper[5118]: I0223 09:47:02.976145 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:47:02 crc kubenswrapper[5118]: I0223 09:47:02.977021 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:47:02 crc kubenswrapper[5118]: I0223 09:47:02.977140 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe" gracePeriod=600 Feb 23 09:47:03 crc kubenswrapper[5118]: I0223 09:47:03.606087 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe" exitCode=0 Feb 23 09:47:03 crc kubenswrapper[5118]: I0223 09:47:03.606142 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe"} Feb 23 09:47:03 crc kubenswrapper[5118]: I0223 09:47:03.606392 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"} Feb 23 09:47:03 crc kubenswrapper[5118]: I0223 09:47:03.606429 5118 scope.go:117] "RemoveContainer" containerID="ab4833c4169edde1e9874058ce5030164acc9a2afd789c861c715c0cf644dc86" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.102217 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:11 crc kubenswrapper[5118]: E0223 09:47:11.103353 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" containerName="collect-profiles" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.103389 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" containerName="collect-profiles" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.103638 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e474bf6-37f9-4ada-b7af-dd3e663cb6b4" containerName="collect-profiles" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.109654 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.112739 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.256780 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85kc\" (UniqueName: \"kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.256838 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.256876 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.359932 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85kc\" (UniqueName: \"kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.359985 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.360018 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.360675 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.360814 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.382819 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85kc\" (UniqueName: \"kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc\") pod \"redhat-marketplace-ztn4v\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.440981 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:11 crc kubenswrapper[5118]: I0223 09:47:11.945577 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:12 crc kubenswrapper[5118]: I0223 09:47:12.736238 5118 generic.go:334] "Generic (PLEG): container finished" podID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerID="cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa" exitCode=0 Feb 23 09:47:12 crc kubenswrapper[5118]: I0223 09:47:12.736357 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerDied","Data":"cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa"} Feb 23 09:47:12 crc kubenswrapper[5118]: I0223 09:47:12.736945 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerStarted","Data":"4bed5db67e19334cd8310afb70fb8d2c131a3ed4732db162fda3bc4bc0638432"} Feb 23 09:47:13 crc kubenswrapper[5118]: I0223 09:47:13.747333 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerStarted","Data":"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea"} Feb 23 09:47:14 crc kubenswrapper[5118]: I0223 09:47:14.757216 5118 generic.go:334] "Generic (PLEG): container finished" podID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerID="f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea" exitCode=0 Feb 23 09:47:14 crc kubenswrapper[5118]: I0223 09:47:14.757299 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" 
event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerDied","Data":"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea"} Feb 23 09:47:16 crc kubenswrapper[5118]: I0223 09:47:16.780569 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerStarted","Data":"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f"} Feb 23 09:47:16 crc kubenswrapper[5118]: I0223 09:47:16.799024 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztn4v" podStartSLOduration=3.360653169 podStartE2EDuration="5.799003715s" podCreationTimestamp="2026-02-23 09:47:11 +0000 UTC" firstStartedPulling="2026-02-23 09:47:12.738344612 +0000 UTC m=+10895.742129195" lastFinishedPulling="2026-02-23 09:47:15.176695168 +0000 UTC m=+10898.180479741" observedRunningTime="2026-02-23 09:47:16.796389041 +0000 UTC m=+10899.800173634" watchObservedRunningTime="2026-02-23 09:47:16.799003715 +0000 UTC m=+10899.802788288" Feb 23 09:47:21 crc kubenswrapper[5118]: I0223 09:47:21.441310 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:21 crc kubenswrapper[5118]: I0223 09:47:21.441827 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:21 crc kubenswrapper[5118]: I0223 09:47:21.498554 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:21 crc kubenswrapper[5118]: I0223 09:47:21.884671 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:21 crc kubenswrapper[5118]: I0223 09:47:21.935394 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:23 crc kubenswrapper[5118]: I0223 09:47:23.856085 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztn4v" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="registry-server" containerID="cri-o://61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f" gracePeriod=2 Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.395249 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.564201 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities\") pod \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.564298 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j85kc\" (UniqueName: \"kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc\") pod \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.564337 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content\") pod \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\" (UID: \"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be\") " Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.565439 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities" (OuterVolumeSpecName: "utilities") pod "8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" (UID: 
"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.569973 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc" (OuterVolumeSpecName: "kube-api-access-j85kc") pod "8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" (UID: "8b77f9ed-65f7-42a5-ae8f-16495d6ff7be"). InnerVolumeSpecName "kube-api-access-j85kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.592124 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" (UID: "8b77f9ed-65f7-42a5-ae8f-16495d6ff7be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.666859 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.666903 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j85kc\" (UniqueName: \"kubernetes.io/projected/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-kube-api-access-j85kc\") on node \"crc\" DevicePath \"\"" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.666918 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.867079 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerID="61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f" exitCode=0 Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.867158 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerDied","Data":"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f"} Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.867190 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztn4v" event={"ID":"8b77f9ed-65f7-42a5-ae8f-16495d6ff7be","Type":"ContainerDied","Data":"4bed5db67e19334cd8310afb70fb8d2c131a3ed4732db162fda3bc4bc0638432"} Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.867209 5118 scope.go:117] "RemoveContainer" containerID="61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.867358 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztn4v" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.897174 5118 scope.go:117] "RemoveContainer" containerID="f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.902465 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.914463 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztn4v"] Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.923364 5118 scope.go:117] "RemoveContainer" containerID="cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.975311 5118 scope.go:117] "RemoveContainer" containerID="61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f" Feb 23 09:47:24 crc kubenswrapper[5118]: E0223 09:47:24.975952 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f\": container with ID starting with 61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f not found: ID does not exist" containerID="61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.975994 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f"} err="failed to get container status \"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f\": rpc error: code = NotFound desc = could not find container \"61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f\": container with ID starting with 61ceb13222fdbcb5b0c841ee8e5cea5944c7e8ffd86b66cbfb7102881e59553f not found: 
ID does not exist" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.976019 5118 scope.go:117] "RemoveContainer" containerID="f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea" Feb 23 09:47:24 crc kubenswrapper[5118]: E0223 09:47:24.976526 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea\": container with ID starting with f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea not found: ID does not exist" containerID="f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.976579 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea"} err="failed to get container status \"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea\": rpc error: code = NotFound desc = could not find container \"f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea\": container with ID starting with f2847d6bfa443f3e5bbc6b2bbea5ada3332ae6f4d8b039267bd899deece4ebea not found: ID does not exist" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.976607 5118 scope.go:117] "RemoveContainer" containerID="cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa" Feb 23 09:47:24 crc kubenswrapper[5118]: E0223 09:47:24.976904 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa\": container with ID starting with cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa not found: ID does not exist" containerID="cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa" Feb 23 09:47:24 crc kubenswrapper[5118]: I0223 09:47:24.976932 5118 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa"} err="failed to get container status \"cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa\": rpc error: code = NotFound desc = could not find container \"cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa\": container with ID starting with cf19604ceba9f50f689f9d94a8cb3c2107b4ae73bfc23deeb51fd7bcb165e1aa not found: ID does not exist" Feb 23 09:47:25 crc kubenswrapper[5118]: I0223 09:47:25.724227 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" path="/var/lib/kubelet/pods/8b77f9ed-65f7-42a5-ae8f-16495d6ff7be/volumes" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.690339 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5v85"] Feb 23 09:48:09 crc kubenswrapper[5118]: E0223 09:48:09.691410 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="registry-server" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.691429 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="registry-server" Feb 23 09:48:09 crc kubenswrapper[5118]: E0223 09:48:09.691449 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="extract-content" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.691458 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="extract-content" Feb 23 09:48:09 crc kubenswrapper[5118]: E0223 09:48:09.691485 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="extract-utilities" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.691494 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="extract-utilities" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.691777 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b77f9ed-65f7-42a5-ae8f-16495d6ff7be" containerName="registry-server" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.693947 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.737250 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5v85"] Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.774843 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlt96\" (UniqueName: \"kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.775019 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.775057 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 
09:48:09.877393 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.877448 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.877563 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlt96\" (UniqueName: \"kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.877966 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.878275 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:09 crc kubenswrapper[5118]: I0223 09:48:09.918134 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlt96\" (UniqueName: \"kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96\") pod \"certified-operators-p5v85\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:10 crc kubenswrapper[5118]: I0223 09:48:10.023694 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:10 crc kubenswrapper[5118]: I0223 09:48:10.560167 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5v85"] Feb 23 09:48:11 crc kubenswrapper[5118]: I0223 09:48:11.379825 5118 generic.go:334] "Generic (PLEG): container finished" podID="96eb8fa1-882f-4244-878a-78ad56340998" containerID="7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f" exitCode=0 Feb 23 09:48:11 crc kubenswrapper[5118]: I0223 09:48:11.379959 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerDied","Data":"7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f"} Feb 23 09:48:11 crc kubenswrapper[5118]: I0223 09:48:11.380461 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerStarted","Data":"80e7340b4b22c6ae9ca03a7931cc9fd4226a33832732a92d11eb6d79c1598b2a"} Feb 23 09:48:11 crc kubenswrapper[5118]: I0223 09:48:11.384075 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 09:48:13 crc kubenswrapper[5118]: I0223 09:48:13.405441 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" 
event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerStarted","Data":"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"} Feb 23 09:48:15 crc kubenswrapper[5118]: I0223 09:48:15.430763 5118 generic.go:334] "Generic (PLEG): container finished" podID="96eb8fa1-882f-4244-878a-78ad56340998" containerID="1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e" exitCode=0 Feb 23 09:48:15 crc kubenswrapper[5118]: I0223 09:48:15.430965 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerDied","Data":"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"} Feb 23 09:48:16 crc kubenswrapper[5118]: I0223 09:48:16.442830 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerStarted","Data":"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"} Feb 23 09:48:16 crc kubenswrapper[5118]: I0223 09:48:16.472585 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5v85" podStartSLOduration=2.945246972 podStartE2EDuration="7.4725569s" podCreationTimestamp="2026-02-23 09:48:09 +0000 UTC" firstStartedPulling="2026-02-23 09:48:11.383848379 +0000 UTC m=+10954.387632952" lastFinishedPulling="2026-02-23 09:48:15.911158307 +0000 UTC m=+10958.914942880" observedRunningTime="2026-02-23 09:48:16.464213998 +0000 UTC m=+10959.467998581" watchObservedRunningTime="2026-02-23 09:48:16.4725569 +0000 UTC m=+10959.476341483" Feb 23 09:48:20 crc kubenswrapper[5118]: I0223 09:48:20.024431 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:20 crc kubenswrapper[5118]: I0223 09:48:20.025251 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:20 crc kubenswrapper[5118]: I0223 09:48:20.096767 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:30 crc kubenswrapper[5118]: I0223 09:48:30.084663 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:30 crc kubenswrapper[5118]: I0223 09:48:30.150292 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5v85"] Feb 23 09:48:30 crc kubenswrapper[5118]: I0223 09:48:30.587132 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5v85" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="registry-server" containerID="cri-o://8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7" gracePeriod=2 Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.196584 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p5v85" Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.251903 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content\") pod \"96eb8fa1-882f-4244-878a-78ad56340998\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.252538 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlt96\" (UniqueName: \"kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96\") pod \"96eb8fa1-882f-4244-878a-78ad56340998\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.252724 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities\") pod \"96eb8fa1-882f-4244-878a-78ad56340998\" (UID: \"96eb8fa1-882f-4244-878a-78ad56340998\") " Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.253227 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities" (OuterVolumeSpecName: "utilities") pod "96eb8fa1-882f-4244-878a-78ad56340998" (UID: "96eb8fa1-882f-4244-878a-78ad56340998"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.253564 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.264570 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96" (OuterVolumeSpecName: "kube-api-access-vlt96") pod "96eb8fa1-882f-4244-878a-78ad56340998" (UID: "96eb8fa1-882f-4244-878a-78ad56340998"). InnerVolumeSpecName "kube-api-access-vlt96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.310447 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96eb8fa1-882f-4244-878a-78ad56340998" (UID: "96eb8fa1-882f-4244-878a-78ad56340998"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.355401 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96eb8fa1-882f-4244-878a-78ad56340998-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.355447 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlt96\" (UniqueName: \"kubernetes.io/projected/96eb8fa1-882f-4244-878a-78ad56340998-kube-api-access-vlt96\") on node \"crc\" DevicePath \"\""
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.597668 5118 generic.go:334] "Generic (PLEG): container finished" podID="96eb8fa1-882f-4244-878a-78ad56340998" containerID="8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7" exitCode=0
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.597727 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5v85"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.597748 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerDied","Data":"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"}
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.599293 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5v85" event={"ID":"96eb8fa1-882f-4244-878a-78ad56340998","Type":"ContainerDied","Data":"80e7340b4b22c6ae9ca03a7931cc9fd4226a33832732a92d11eb6d79c1598b2a"}
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.599400 5118 scope.go:117] "RemoveContainer" containerID="8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.627390 5118 scope.go:117] "RemoveContainer" containerID="1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.647711 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5v85"]
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.673760 5118 scope.go:117] "RemoveContainer" containerID="7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.684988 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5v85"]
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.712774 5118 scope.go:117] "RemoveContainer" containerID="8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"
Feb 23 09:48:31 crc kubenswrapper[5118]: E0223 09:48:31.713331 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7\": container with ID starting with 8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7 not found: ID does not exist" containerID="8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.713371 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7"} err="failed to get container status \"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7\": rpc error: code = NotFound desc = could not find container \"8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7\": container with ID starting with 8489bee328806097ed2228051857ca47fd08c29c31e782451f9c37c0d9044de7 not found: ID does not exist"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.713392 5118 scope.go:117] "RemoveContainer" containerID="1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"
Feb 23 09:48:31 crc kubenswrapper[5118]: E0223 09:48:31.713775 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e\": container with ID starting with 1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e not found: ID does not exist" containerID="1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.713801 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e"} err="failed to get container status \"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e\": rpc error: code = NotFound desc = could not find container \"1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e\": container with ID starting with 1da5fe9915b09d172cd46539b1b8a5b53680953152ff33d569b9ab729a06705e not found: ID does not exist"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.713813 5118 scope.go:117] "RemoveContainer" containerID="7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f"
Feb 23 09:48:31 crc kubenswrapper[5118]: E0223 09:48:31.714615 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f\": container with ID starting with 7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f not found: ID does not exist" containerID="7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.714637 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f"} err="failed to get container status \"7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f\": rpc error: code = NotFound desc = could not find container \"7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f\": container with ID starting with 7cffec69ae83348370acc7c497ddd4c259f001276d16aca8206195a816cfa88f not found: ID does not exist"
Feb 23 09:48:31 crc kubenswrapper[5118]: I0223 09:48:31.719070 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96eb8fa1-882f-4244-878a-78ad56340998" path="/var/lib/kubelet/pods/96eb8fa1-882f-4244-878a-78ad56340998/volumes"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.607717 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:28 crc kubenswrapper[5118]: E0223 09:49:28.610187 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="registry-server"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.610207 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="registry-server"
Feb 23 09:49:28 crc kubenswrapper[5118]: E0223 09:49:28.610223 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="extract-content"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.610231 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="extract-content"
Feb 23 09:49:28 crc kubenswrapper[5118]: E0223 09:49:28.610247 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="extract-utilities"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.610254 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="extract-utilities"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.610489 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="96eb8fa1-882f-4244-878a-78ad56340998" containerName="registry-server"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.612077 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.644292 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.739648 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.740051 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.740389 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vncw\" (UniqueName: \"kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.842385 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.842948 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.843178 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.843235 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.843327 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vncw\" (UniqueName: \"kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.874463 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vncw\" (UniqueName: \"kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw\") pod \"community-operators-6mcgl\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") " pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:28 crc kubenswrapper[5118]: I0223 09:49:28.943438 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:29 crc kubenswrapper[5118]: I0223 09:49:29.450702 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:30 crc kubenswrapper[5118]: I0223 09:49:30.245949 5118 generic.go:334] "Generic (PLEG): container finished" podID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerID="43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee" exitCode=0
Feb 23 09:49:30 crc kubenswrapper[5118]: I0223 09:49:30.246022 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerDied","Data":"43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee"}
Feb 23 09:49:30 crc kubenswrapper[5118]: I0223 09:49:30.246270 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerStarted","Data":"10ca3f3f338b731c00030d23e1194727a1a1ee59d66e01795eddc915fc056c5f"}
Feb 23 09:49:31 crc kubenswrapper[5118]: I0223 09:49:31.257001 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerStarted","Data":"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"}
Feb 23 09:49:32 crc kubenswrapper[5118]: I0223 09:49:32.268330 5118 generic.go:334] "Generic (PLEG): container finished" podID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerID="45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf" exitCode=0
Feb 23 09:49:32 crc kubenswrapper[5118]: I0223 09:49:32.268391 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerDied","Data":"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"}
Feb 23 09:49:32 crc kubenswrapper[5118]: I0223 09:49:32.975939 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:49:32 crc kubenswrapper[5118]: I0223 09:49:32.976350 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:49:33 crc kubenswrapper[5118]: I0223 09:49:33.300176 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerStarted","Data":"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"}
Feb 23 09:49:33 crc kubenswrapper[5118]: I0223 09:49:33.329458 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6mcgl" podStartSLOduration=2.9202722679999997 podStartE2EDuration="5.329434905s" podCreationTimestamp="2026-02-23 09:49:28 +0000 UTC" firstStartedPulling="2026-02-23 09:49:30.2481484 +0000 UTC m=+11033.251932973" lastFinishedPulling="2026-02-23 09:49:32.657311037 +0000 UTC m=+11035.661095610" observedRunningTime="2026-02-23 09:49:33.317358303 +0000 UTC m=+11036.321142886" watchObservedRunningTime="2026-02-23 09:49:33.329434905 +0000 UTC m=+11036.333219478"
Feb 23 09:49:38 crc kubenswrapper[5118]: I0223 09:49:38.943663 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:38 crc kubenswrapper[5118]: I0223 09:49:38.944257 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:38 crc kubenswrapper[5118]: I0223 09:49:38.994164 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:39 crc kubenswrapper[5118]: I0223 09:49:39.425616 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:39 crc kubenswrapper[5118]: I0223 09:49:39.481164 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.384841 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6mcgl" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="registry-server" containerID="cri-o://e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af" gracePeriod=2
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.855029 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.922794 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content\") pod \"708a586a-62c8-4ee9-92b9-2ccc0784672b\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") "
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.923018 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities\") pod \"708a586a-62c8-4ee9-92b9-2ccc0784672b\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") "
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.923215 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vncw\" (UniqueName: \"kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw\") pod \"708a586a-62c8-4ee9-92b9-2ccc0784672b\" (UID: \"708a586a-62c8-4ee9-92b9-2ccc0784672b\") "
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.924335 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities" (OuterVolumeSpecName: "utilities") pod "708a586a-62c8-4ee9-92b9-2ccc0784672b" (UID: "708a586a-62c8-4ee9-92b9-2ccc0784672b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.931414 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw" (OuterVolumeSpecName: "kube-api-access-5vncw") pod "708a586a-62c8-4ee9-92b9-2ccc0784672b" (UID: "708a586a-62c8-4ee9-92b9-2ccc0784672b"). InnerVolumeSpecName "kube-api-access-5vncw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:49:41 crc kubenswrapper[5118]: I0223 09:49:41.976936 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "708a586a-62c8-4ee9-92b9-2ccc0784672b" (UID: "708a586a-62c8-4ee9-92b9-2ccc0784672b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.026058 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vncw\" (UniqueName: \"kubernetes.io/projected/708a586a-62c8-4ee9-92b9-2ccc0784672b-kube-api-access-5vncw\") on node \"crc\" DevicePath \"\""
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.026127 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.026137 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708a586a-62c8-4ee9-92b9-2ccc0784672b-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.400923 5118 generic.go:334] "Generic (PLEG): container finished" podID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerID="e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af" exitCode=0
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.401041 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mcgl"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.401082 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerDied","Data":"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"}
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.401531 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mcgl" event={"ID":"708a586a-62c8-4ee9-92b9-2ccc0784672b","Type":"ContainerDied","Data":"10ca3f3f338b731c00030d23e1194727a1a1ee59d66e01795eddc915fc056c5f"}
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.401584 5118 scope.go:117] "RemoveContainer" containerID="e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.438795 5118 scope.go:117] "RemoveContainer" containerID="45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.447662 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.462049 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6mcgl"]
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.475644 5118 scope.go:117] "RemoveContainer" containerID="43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.517735 5118 scope.go:117] "RemoveContainer" containerID="e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"
Feb 23 09:49:42 crc kubenswrapper[5118]: E0223 09:49:42.522849 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af\": container with ID starting with e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af not found: ID does not exist" containerID="e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.522913 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af"} err="failed to get container status \"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af\": rpc error: code = NotFound desc = could not find container \"e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af\": container with ID starting with e1efb7540c88fe200f90cfd28108b78084601255eb1ef56c1c174032b71b56af not found: ID does not exist"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.522949 5118 scope.go:117] "RemoveContainer" containerID="45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"
Feb 23 09:49:42 crc kubenswrapper[5118]: E0223 09:49:42.523373 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf\": container with ID starting with 45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf not found: ID does not exist" containerID="45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.523495 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf"} err="failed to get container status \"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf\": rpc error: code = NotFound desc = could not find container \"45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf\": container with ID starting with 45a6b9d33ff320ec3a4f93cfa616e5f377892cfad575a4dcf1e013ce9405f5bf not found: ID does not exist"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.523534 5118 scope.go:117] "RemoveContainer" containerID="43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee"
Feb 23 09:49:42 crc kubenswrapper[5118]: E0223 09:49:42.527628 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee\": container with ID starting with 43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee not found: ID does not exist" containerID="43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee"
Feb 23 09:49:42 crc kubenswrapper[5118]: I0223 09:49:42.527663 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee"} err="failed to get container status \"43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee\": rpc error: code = NotFound desc = could not find container \"43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee\": container with ID starting with 43004a6747992b58b56dcd92d90a08ef3c1947feb0abda00050c6738bcdca5ee not found: ID does not exist"
Feb 23 09:49:43 crc kubenswrapper[5118]: I0223 09:49:43.720797 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" path="/var/lib/kubelet/pods/708a586a-62c8-4ee9-92b9-2ccc0784672b/volumes"
Feb 23 09:50:02 crc kubenswrapper[5118]: I0223 09:50:02.974800 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:50:02 crc kubenswrapper[5118]: I0223 09:50:02.975231 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:50:32 crc kubenswrapper[5118]: I0223 09:50:32.975640 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:50:32 crc kubenswrapper[5118]: I0223 09:50:32.976326 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 09:50:32 crc kubenswrapper[5118]: I0223 09:50:32.976399 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 09:50:32 crc kubenswrapper[5118]: I0223 09:50:32.977527 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 09:50:32 crc kubenswrapper[5118]: I0223 09:50:32.977781 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" gracePeriod=600
Feb 23 09:50:33 crc kubenswrapper[5118]: E0223 09:50:33.116507 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:50:33 crc kubenswrapper[5118]: I0223 09:50:33.995610 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" exitCode=0
Feb 23 09:50:33 crc kubenswrapper[5118]: I0223 09:50:33.995704 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"}
Feb 23 09:50:33 crc kubenswrapper[5118]: I0223 09:50:33.995968 5118 scope.go:117] "RemoveContainer" containerID="760940411e99c7beeb2fe1ad85d4450f709b870eb66d2f725bf08ee457de23fe"
Feb 23 09:50:33 crc kubenswrapper[5118]: I0223 09:50:33.997329 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:50:33 crc kubenswrapper[5118]: E0223 09:50:33.997772 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:50:48 crc kubenswrapper[5118]: I0223 09:50:48.697673 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:50:48 crc kubenswrapper[5118]: E0223 09:50:48.698594 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:51:03 crc kubenswrapper[5118]: I0223 09:51:03.697693 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:51:03 crc kubenswrapper[5118]: E0223 09:51:03.698525 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:51:17 crc kubenswrapper[5118]: I0223 09:51:17.705752 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:51:17 crc kubenswrapper[5118]: E0223 09:51:17.706702 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.135164 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 23 09:51:18 crc kubenswrapper[5118]: E0223 09:51:18.136086 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="registry-server"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.136120 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="registry-server"
Feb 23 09:51:18 crc kubenswrapper[5118]: E0223 09:51:18.136138 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="extract-content"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.136146 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="extract-content"
Feb 23 09:51:18 crc kubenswrapper[5118]: E0223 09:51:18.136168 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="extract-utilities"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.136178 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="extract-utilities"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.136429 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="708a586a-62c8-4ee9-92b9-2ccc0784672b" containerName="registry-server"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.137546 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.141604 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.141852 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.141928 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbpwf"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.141979 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.149281 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.270840 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.270905 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.270951 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271004 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271034 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271057 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271110 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest"
Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271158 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8hk\"
(UniqueName: \"kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.271183 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373726 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373805 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373833 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373874 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373925 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8hk\" (UniqueName: \"kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.373948 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.374022 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.374061 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.374117 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary\") pod 
\"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.374694 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.375679 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.375754 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.379951 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.386131 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.395739 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.395854 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.395904 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.398868 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8hk\" (UniqueName: \"kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.413711 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") " pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.459725 5118 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 23 09:51:18 crc kubenswrapper[5118]: I0223 09:51:18.968271 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 23 09:51:19 crc kubenswrapper[5118]: I0223 09:51:19.536448 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5f49473-b37e-44f5-a7cf-ad38cd064729","Type":"ContainerStarted","Data":"ed2cb317846b7d2ebb4688b6de2060936d6e9bee5d9a88fd7e03ffbfb6c304a0"} Feb 23 09:51:28 crc kubenswrapper[5118]: I0223 09:51:28.697617 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:51:28 crc kubenswrapper[5118]: E0223 09:51:28.698548 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:51:43 crc kubenswrapper[5118]: I0223 09:51:43.698228 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:51:43 crc kubenswrapper[5118]: E0223 09:51:43.699235 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:51:55 crc kubenswrapper[5118]: I0223 09:51:55.700131 5118 scope.go:117] "RemoveContainer" 
containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:51:55 crc kubenswrapper[5118]: E0223 09:51:55.702785 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:52:10 crc kubenswrapper[5118]: I0223 09:52:10.700376 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:52:10 crc kubenswrapper[5118]: E0223 09:52:10.701881 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:52:11 crc kubenswrapper[5118]: E0223 09:52:11.531290 5118 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb" Feb 23 09:52:11 crc kubenswrapper[5118]: E0223 09:52:11.531618 5118 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb" Feb 23 09:52:11 crc kubenswrapper[5118]: E0223 09:52:11.531757 5118 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nl8hk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b5f49473-b37e-44f5-a7cf-ad38cd064729): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 09:52:11 crc kubenswrapper[5118]: E0223 09:52:11.533001 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b5f49473-b37e-44f5-a7cf-ad38cd064729" Feb 23 09:52:12 crc kubenswrapper[5118]: E0223 09:52:12.144307 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8419493e1fd846703d277695e03fc5eb\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="b5f49473-b37e-44f5-a7cf-ad38cd064729" Feb 23 09:52:22 crc kubenswrapper[5118]: I0223 09:52:22.912010 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 23 09:52:24 crc kubenswrapper[5118]: I0223 09:52:24.263206 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5f49473-b37e-44f5-a7cf-ad38cd064729","Type":"ContainerStarted","Data":"bbdf74d22d0c7f3828b26e17f7782034ae071f0204d04bbae8d79f7095d5e51b"} Feb 23 09:52:24 crc kubenswrapper[5118]: I0223 09:52:24.281892 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.332135618 podStartE2EDuration="1m7.281872673s" podCreationTimestamp="2026-02-23 09:51:17 +0000 UTC" firstStartedPulling="2026-02-23 09:51:18.959694178 +0000 UTC m=+11141.963478791" lastFinishedPulling="2026-02-23 09:52:22.909431263 +0000 UTC m=+11205.913215846" observedRunningTime="2026-02-23 09:52:24.276337868 +0000 UTC m=+11207.280122451" watchObservedRunningTime="2026-02-23 09:52:24.281872673 +0000 UTC m=+11207.285657246" Feb 23 09:52:25 crc kubenswrapper[5118]: I0223 09:52:25.698963 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:52:25 crc kubenswrapper[5118]: E0223 09:52:25.699777 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:52:32 crc kubenswrapper[5118]: I0223 09:52:32.844005 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"] 
Feb 23 09:52:32 crc kubenswrapper[5118]: I0223 09:52:32.846681 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:32 crc kubenswrapper[5118]: I0223 09:52:32.859206 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"] Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.016804 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.016969 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.017147 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplqz\" (UniqueName: \"kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.118950 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc 
kubenswrapper[5118]: I0223 09:52:33.119381 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.119496 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.119637 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.119732 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wplqz\" (UniqueName: \"kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.140953 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplqz\" (UniqueName: \"kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz\") pod \"redhat-operators-htwtd\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") " pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.169658 5118 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:33 crc kubenswrapper[5118]: I0223 09:52:33.658846 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"] Feb 23 09:52:34 crc kubenswrapper[5118]: I0223 09:52:34.388516 5118 generic.go:334] "Generic (PLEG): container finished" podID="e413fda0-8280-4c32-a29b-28d9526c7506" containerID="2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e" exitCode=0 Feb 23 09:52:34 crc kubenswrapper[5118]: I0223 09:52:34.388813 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerDied","Data":"2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e"} Feb 23 09:52:34 crc kubenswrapper[5118]: I0223 09:52:34.388846 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerStarted","Data":"5fd53bd7d16009943a476dc28f70cf53fbaa4377662f8d455463ccbabb7977a8"} Feb 23 09:52:35 crc kubenswrapper[5118]: I0223 09:52:35.399610 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerStarted","Data":"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"} Feb 23 09:52:39 crc kubenswrapper[5118]: I0223 09:52:39.699057 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:52:39 crc kubenswrapper[5118]: E0223 09:52:39.702949 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:52:40 crc kubenswrapper[5118]: I0223 09:52:40.454689 5118 generic.go:334] "Generic (PLEG): container finished" podID="e413fda0-8280-4c32-a29b-28d9526c7506" containerID="75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770" exitCode=0 Feb 23 09:52:40 crc kubenswrapper[5118]: I0223 09:52:40.454746 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerDied","Data":"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"} Feb 23 09:52:41 crc kubenswrapper[5118]: I0223 09:52:41.465802 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerStarted","Data":"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"} Feb 23 09:52:41 crc kubenswrapper[5118]: I0223 09:52:41.497055 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-htwtd" podStartSLOduration=3.042672134 podStartE2EDuration="9.497031714s" podCreationTimestamp="2026-02-23 09:52:32 +0000 UTC" firstStartedPulling="2026-02-23 09:52:34.394629481 +0000 UTC m=+11217.398414054" lastFinishedPulling="2026-02-23 09:52:40.848989061 +0000 UTC m=+11223.852773634" observedRunningTime="2026-02-23 09:52:41.48820163 +0000 UTC m=+11224.491986213" watchObservedRunningTime="2026-02-23 09:52:41.497031714 +0000 UTC m=+11224.500816287" Feb 23 09:52:43 crc kubenswrapper[5118]: I0223 09:52:43.169837 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:43 crc kubenswrapper[5118]: I0223 
09:52:43.170746 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-htwtd" Feb 23 09:52:44 crc kubenswrapper[5118]: I0223 09:52:44.255631 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-htwtd" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server" probeResult="failure" output=< Feb 23 09:52:44 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:52:44 crc kubenswrapper[5118]: > Feb 23 09:52:51 crc kubenswrapper[5118]: I0223 09:52:51.697803 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:52:51 crc kubenswrapper[5118]: E0223 09:52:51.698564 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 09:52:54 crc kubenswrapper[5118]: I0223 09:52:54.223628 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-htwtd" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server" probeResult="failure" output=< Feb 23 09:52:54 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 09:52:54 crc kubenswrapper[5118]: > Feb 23 09:53:04 crc kubenswrapper[5118]: I0223 09:53:04.244207 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-htwtd" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server" probeResult="failure" output=< Feb 23 09:53:04 crc kubenswrapper[5118]: timeout: failed to connect service 
":50051" within 1s
Feb 23 09:53:04 crc kubenswrapper[5118]: >
Feb 23 09:53:05 crc kubenswrapper[5118]: I0223 09:53:05.697485 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:53:05 crc kubenswrapper[5118]: E0223 09:53:05.698157 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:53:14 crc kubenswrapper[5118]: I0223 09:53:14.221322 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-htwtd" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server" probeResult="failure" output=<
Feb 23 09:53:14 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s
Feb 23 09:53:14 crc kubenswrapper[5118]: >
Feb 23 09:53:19 crc kubenswrapper[5118]: I0223 09:53:19.698134 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:53:19 crc kubenswrapper[5118]: E0223 09:53:19.699037 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:53:23 crc kubenswrapper[5118]: I0223 09:53:23.228739 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-htwtd"
Feb 23 09:53:23 crc kubenswrapper[5118]: I0223 09:53:23.297808 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-htwtd"
Feb 23 09:53:23 crc kubenswrapper[5118]: I0223 09:53:23.471171 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"]
Feb 23 09:53:24 crc kubenswrapper[5118]: I0223 09:53:24.929829 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-htwtd" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server" containerID="cri-o://c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510" gracePeriod=2
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.708161 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-htwtd"
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.832059 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities\") pod \"e413fda0-8280-4c32-a29b-28d9526c7506\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") "
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.832202 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content\") pod \"e413fda0-8280-4c32-a29b-28d9526c7506\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") "
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.832257 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplqz\" (UniqueName: \"kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz\") pod \"e413fda0-8280-4c32-a29b-28d9526c7506\" (UID: \"e413fda0-8280-4c32-a29b-28d9526c7506\") "
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.832636 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities" (OuterVolumeSpecName: "utilities") pod "e413fda0-8280-4c32-a29b-28d9526c7506" (UID: "e413fda0-8280-4c32-a29b-28d9526c7506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.833313 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.839558 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz" (OuterVolumeSpecName: "kube-api-access-wplqz") pod "e413fda0-8280-4c32-a29b-28d9526c7506" (UID: "e413fda0-8280-4c32-a29b-28d9526c7506"). InnerVolumeSpecName "kube-api-access-wplqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.935051 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wplqz\" (UniqueName: \"kubernetes.io/projected/e413fda0-8280-4c32-a29b-28d9526c7506-kube-api-access-wplqz\") on node \"crc\" DevicePath \"\""
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.938962 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e413fda0-8280-4c32-a29b-28d9526c7506" (UID: "e413fda0-8280-4c32-a29b-28d9526c7506"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.941617 5118 generic.go:334] "Generic (PLEG): container finished" podID="e413fda0-8280-4c32-a29b-28d9526c7506" containerID="c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510" exitCode=0
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.941657 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerDied","Data":"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"}
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.941682 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-htwtd" event={"ID":"e413fda0-8280-4c32-a29b-28d9526c7506","Type":"ContainerDied","Data":"5fd53bd7d16009943a476dc28f70cf53fbaa4377662f8d455463ccbabb7977a8"}
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.941697 5118 scope.go:117] "RemoveContainer" containerID="c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.941830 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-htwtd"
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.976370 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"]
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.976581 5118 scope.go:117] "RemoveContainer" containerID="75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"
Feb 23 09:53:25 crc kubenswrapper[5118]: I0223 09:53:25.988937 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-htwtd"]
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.001751 5118 scope.go:117] "RemoveContainer" containerID="2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.036807 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e413fda0-8280-4c32-a29b-28d9526c7506-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.056036 5118 scope.go:117] "RemoveContainer" containerID="c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"
Feb 23 09:53:26 crc kubenswrapper[5118]: E0223 09:53:26.056430 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510\": container with ID starting with c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510 not found: ID does not exist" containerID="c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.056464 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510"} err="failed to get container status \"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510\": rpc error: code = NotFound desc = could not find container \"c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510\": container with ID starting with c24c5cdeb48b1c35f7a26298903656928dd8ef024c0569c3a862e7b8e439d510 not found: ID does not exist"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.056485 5118 scope.go:117] "RemoveContainer" containerID="75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"
Feb 23 09:53:26 crc kubenswrapper[5118]: E0223 09:53:26.056741 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770\": container with ID starting with 75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770 not found: ID does not exist" containerID="75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.056758 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770"} err="failed to get container status \"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770\": rpc error: code = NotFound desc = could not find container \"75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770\": container with ID starting with 75a3f779b1165c724d32265ab1fd7ea8000d86d4af23d8c93a4ef8bc513e8770 not found: ID does not exist"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.056772 5118 scope.go:117] "RemoveContainer" containerID="2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e"
Feb 23 09:53:26 crc kubenswrapper[5118]: E0223 09:53:26.057056 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e\": container with ID starting with 2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e not found: ID does not exist" containerID="2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e"
Feb 23 09:53:26 crc kubenswrapper[5118]: I0223 09:53:26.057078 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e"} err="failed to get container status \"2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e\": rpc error: code = NotFound desc = could not find container \"2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e\": container with ID starting with 2798d6cf2ccb82e9a8a228a3dec2d6bbe62f39d3c7da475fad97f95f3ca3b90e not found: ID does not exist"
Feb 23 09:53:27 crc kubenswrapper[5118]: I0223 09:53:27.710561 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" path="/var/lib/kubelet/pods/e413fda0-8280-4c32-a29b-28d9526c7506/volumes"
Feb 23 09:53:31 crc kubenswrapper[5118]: I0223 09:53:31.699400 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:53:31 crc kubenswrapper[5118]: E0223 09:53:31.699920 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:53:45 crc kubenswrapper[5118]: I0223 09:53:45.697828 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:53:45 crc kubenswrapper[5118]: E0223 09:53:45.698585 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:53:56 crc kubenswrapper[5118]: I0223 09:53:56.698026 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:53:56 crc kubenswrapper[5118]: E0223 09:53:56.699522 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:54:11 crc kubenswrapper[5118]: I0223 09:54:11.697073 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:54:11 crc kubenswrapper[5118]: E0223 09:54:11.697994 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:54:23 crc kubenswrapper[5118]: I0223 09:54:23.697954 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:54:23 crc kubenswrapper[5118]: E0223 09:54:23.698892 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:54:34 crc kubenswrapper[5118]: I0223 09:54:34.697438 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:54:34 crc kubenswrapper[5118]: E0223 09:54:34.698161 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:54:46 crc kubenswrapper[5118]: I0223 09:54:46.698064 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:54:46 crc kubenswrapper[5118]: E0223 09:54:46.698602 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:54:59 crc kubenswrapper[5118]: I0223 09:54:59.697462 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:54:59 crc kubenswrapper[5118]: E0223 09:54:59.698403 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:55:14 crc kubenswrapper[5118]: I0223 09:55:14.698396 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:55:14 crc kubenswrapper[5118]: E0223 09:55:14.699058 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:55:28 crc kubenswrapper[5118]: I0223 09:55:28.697626 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:55:28 crc kubenswrapper[5118]: E0223 09:55:28.698366 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 09:55:40 crc kubenswrapper[5118]: I0223 09:55:40.697181 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30"
Feb 23 09:55:41 crc kubenswrapper[5118]: I0223 09:55:41.317634 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458"}
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.218554 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:38 crc kubenswrapper[5118]: E0223 09:57:38.219434 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.219446 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server"
Feb 23 09:57:38 crc kubenswrapper[5118]: E0223 09:57:38.219464 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="extract-utilities"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.219476 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="extract-utilities"
Feb 23 09:57:38 crc kubenswrapper[5118]: E0223 09:57:38.219494 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="extract-content"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.219500 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="extract-content"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.219707 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e413fda0-8280-4c32-a29b-28d9526c7506" containerName="registry-server"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.221077 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.225676 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.360970 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzqqq\" (UniqueName: \"kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.361123 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.361147 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.463644 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzqqq\" (UniqueName: \"kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.463816 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.463843 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.464348 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.464609 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.486023 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzqqq\" (UniqueName: \"kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq\") pod \"redhat-marketplace-krgxw\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") " pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:38 crc kubenswrapper[5118]: I0223 09:57:38.539081 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:39 crc kubenswrapper[5118]: I0223 09:57:39.239131 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:39 crc kubenswrapper[5118]: I0223 09:57:39.519595 5118 generic.go:334] "Generic (PLEG): container finished" podID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerID="1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc" exitCode=0
Feb 23 09:57:39 crc kubenswrapper[5118]: I0223 09:57:39.519757 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerDied","Data":"1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc"}
Feb 23 09:57:39 crc kubenswrapper[5118]: I0223 09:57:39.520025 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerStarted","Data":"1649d819ac0cc76deb505a0e47f093fa687e51491ab26da4fa927a8081fca4fe"}
Feb 23 09:57:39 crc kubenswrapper[5118]: I0223 09:57:39.522177 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 09:57:40 crc kubenswrapper[5118]: I0223 09:57:40.531309 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerStarted","Data":"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"}
Feb 23 09:57:41 crc kubenswrapper[5118]: I0223 09:57:41.541576 5118 generic.go:334] "Generic (PLEG): container finished" podID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerID="4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31" exitCode=0
Feb 23 09:57:41 crc kubenswrapper[5118]: I0223 09:57:41.541632 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerDied","Data":"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"}
Feb 23 09:57:42 crc kubenswrapper[5118]: I0223 09:57:42.573499 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerStarted","Data":"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"}
Feb 23 09:57:42 crc kubenswrapper[5118]: I0223 09:57:42.602667 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krgxw" podStartSLOduration=2.179272335 podStartE2EDuration="4.60264613s" podCreationTimestamp="2026-02-23 09:57:38 +0000 UTC" firstStartedPulling="2026-02-23 09:57:39.521924846 +0000 UTC m=+11522.525709419" lastFinishedPulling="2026-02-23 09:57:41.945298641 +0000 UTC m=+11524.949083214" observedRunningTime="2026-02-23 09:57:42.59026758 +0000 UTC m=+11525.594052153" watchObservedRunningTime="2026-02-23 09:57:42.60264613 +0000 UTC m=+11525.606430703"
Feb 23 09:57:48 crc kubenswrapper[5118]: I0223 09:57:48.539514 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:48 crc kubenswrapper[5118]: I0223 09:57:48.540013 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:48 crc kubenswrapper[5118]: I0223 09:57:48.648580 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:48 crc kubenswrapper[5118]: I0223 09:57:48.705054 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:48 crc kubenswrapper[5118]: I0223 09:57:48.892688 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:50 crc kubenswrapper[5118]: I0223 09:57:50.650959 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krgxw" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="registry-server" containerID="cri-o://e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e" gracePeriod=2
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.299675 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.421082 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content\") pod \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") "
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.421238 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzqqq\" (UniqueName: \"kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq\") pod \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") "
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.421311 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities\") pod \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\" (UID: \"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb\") "
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.422686 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities" (OuterVolumeSpecName: "utilities") pod "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" (UID: "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.428348 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq" (OuterVolumeSpecName: "kube-api-access-tzqqq") pod "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" (UID: "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb"). InnerVolumeSpecName "kube-api-access-tzqqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.466418 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" (UID: "edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.523499 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.523536 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzqqq\" (UniqueName: \"kubernetes.io/projected/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-kube-api-access-tzqqq\") on node \"crc\" DevicePath \"\""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.523549 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.662081 5118 generic.go:334] "Generic (PLEG): container finished" podID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerID="e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e" exitCode=0
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.662135 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerDied","Data":"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"}
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.662170 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krgxw"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.662198 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krgxw" event={"ID":"edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb","Type":"ContainerDied","Data":"1649d819ac0cc76deb505a0e47f093fa687e51491ab26da4fa927a8081fca4fe"}
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.662225 5118 scope.go:117] "RemoveContainer" containerID="e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.693587 5118 scope.go:117] "RemoveContainer" containerID="4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.721766 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.733231 5118 scope.go:117] "RemoveContainer" containerID="1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.740193 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krgxw"]
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.781354 5118 scope.go:117] "RemoveContainer" containerID="e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"
Feb 23 09:57:51 crc kubenswrapper[5118]: E0223 09:57:51.787277 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e\": container with ID starting with e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e not found: ID does not exist" containerID="e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.787355 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e"} err="failed to get container status \"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e\": rpc error: code = NotFound desc = could not find container \"e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e\": container with ID starting with e31c38689d15f611edf4ef2f45a589cbdee724583e1d7da4e4e7faa7dff50d2e not found: ID does not exist"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.787390 5118 scope.go:117] "RemoveContainer" containerID="4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"
Feb 23 09:57:51 crc kubenswrapper[5118]: E0223 09:57:51.787897 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31\": container with ID starting with 4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31 not found: ID does not exist" containerID="4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.787929 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31"} err="failed to get container status \"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31\": rpc error: code = NotFound desc = could not find container \"4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31\": container with ID starting with 4bfe804083ea27044e44e113b3c08801326fbb3de8b63d1b0164e68801c32c31 not found: ID does not exist"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.787948 5118 scope.go:117] "RemoveContainer" containerID="1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc"
Feb 23 09:57:51 crc kubenswrapper[5118]: E0223 09:57:51.788529 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc\": container with ID starting with 1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc not found: ID does not exist" containerID="1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc"
Feb 23 09:57:51 crc kubenswrapper[5118]: I0223 09:57:51.788592 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc"} err="failed to get container status \"1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc\": rpc error: code = NotFound desc = could not find container \"1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc\": container with ID starting with 1e95e43c9e45f9c09cc8dc8e543d905c5cc5d6b6ca681046b878fe64c7c2c1bc not found: ID does not exist"
Feb 23 09:57:53 crc kubenswrapper[5118]: I0223 09:57:53.712259 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" path="/var/lib/kubelet/pods/edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb/volumes"
Feb 23 09:58:02 crc kubenswrapper[5118]: I0223 09:58:02.974854 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 09:58:02 crc kubenswrapper[5118]: I0223 09:58:02.975461 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection
refused" Feb 23 09:58:32 crc kubenswrapper[5118]: I0223 09:58:32.975638 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:58:32 crc kubenswrapper[5118]: I0223 09:58:32.976136 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:59:02 crc kubenswrapper[5118]: I0223 09:59:02.975617 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 09:59:02 crc kubenswrapper[5118]: I0223 09:59:02.977704 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 09:59:02 crc kubenswrapper[5118]: I0223 09:59:02.977934 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 09:59:02 crc kubenswrapper[5118]: I0223 09:59:02.979614 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458"} 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 09:59:02 crc kubenswrapper[5118]: I0223 09:59:02.980347 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458" gracePeriod=600 Feb 23 09:59:03 crc kubenswrapper[5118]: I0223 09:59:03.427562 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458" exitCode=0 Feb 23 09:59:03 crc kubenswrapper[5118]: I0223 09:59:03.427651 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458"} Feb 23 09:59:03 crc kubenswrapper[5118]: I0223 09:59:03.428051 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03"} Feb 23 09:59:03 crc kubenswrapper[5118]: I0223 09:59:03.428072 5118 scope.go:117] "RemoveContainer" containerID="3e45cc4137b63cddc6dd50628df71ea483b6bcf24723bfc08c6de47ae16aab30" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.777865 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:23 crc kubenswrapper[5118]: E0223 09:59:23.778646 5118 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="registry-server" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.778674 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="registry-server" Feb 23 09:59:23 crc kubenswrapper[5118]: E0223 09:59:23.778692 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="extract-content" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.778700 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="extract-content" Feb 23 09:59:23 crc kubenswrapper[5118]: E0223 09:59:23.778710 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="extract-utilities" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.778716 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="extract-utilities" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.778960 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc0ee18-dd2c-49a5-9f3e-39629b8bb7fb" containerName="registry-server" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.780293 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.793763 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.873836 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.873976 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6bq\" (UniqueName: \"kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.874041 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.976053 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.976262 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wt6bq\" (UniqueName: \"kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.976338 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.977879 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:23 crc kubenswrapper[5118]: I0223 09:59:23.977934 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:24 crc kubenswrapper[5118]: I0223 09:59:23.998935 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6bq\" (UniqueName: \"kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq\") pod \"certified-operators-b6nnh\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:24 crc kubenswrapper[5118]: I0223 09:59:24.100262 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:24 crc kubenswrapper[5118]: I0223 09:59:24.730548 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:25 crc kubenswrapper[5118]: I0223 09:59:25.689979 5118 generic.go:334] "Generic (PLEG): container finished" podID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerID="6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c" exitCode=0 Feb 23 09:59:25 crc kubenswrapper[5118]: I0223 09:59:25.690326 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerDied","Data":"6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c"} Feb 23 09:59:25 crc kubenswrapper[5118]: I0223 09:59:25.690362 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerStarted","Data":"54555436d8e3ac2769aa34c10eb07f3bfb957134c625d358dce88eaec586f068"} Feb 23 09:59:27 crc kubenswrapper[5118]: I0223 09:59:27.715656 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerStarted","Data":"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9"} Feb 23 09:59:28 crc kubenswrapper[5118]: I0223 09:59:28.731016 5118 generic.go:334] "Generic (PLEG): container finished" podID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerID="9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9" exitCode=0 Feb 23 09:59:28 crc kubenswrapper[5118]: I0223 09:59:28.731382 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" 
event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerDied","Data":"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9"} Feb 23 09:59:29 crc kubenswrapper[5118]: I0223 09:59:29.743391 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerStarted","Data":"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2"} Feb 23 09:59:29 crc kubenswrapper[5118]: I0223 09:59:29.782832 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b6nnh" podStartSLOduration=3.3763727709999998 podStartE2EDuration="6.782807063s" podCreationTimestamp="2026-02-23 09:59:23 +0000 UTC" firstStartedPulling="2026-02-23 09:59:25.692465198 +0000 UTC m=+11628.696249771" lastFinishedPulling="2026-02-23 09:59:29.09889949 +0000 UTC m=+11632.102684063" observedRunningTime="2026-02-23 09:59:29.770965676 +0000 UTC m=+11632.774750249" watchObservedRunningTime="2026-02-23 09:59:29.782807063 +0000 UTC m=+11632.786591636" Feb 23 09:59:34 crc kubenswrapper[5118]: I0223 09:59:34.100358 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:34 crc kubenswrapper[5118]: I0223 09:59:34.102049 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:34 crc kubenswrapper[5118]: I0223 09:59:34.166828 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:34 crc kubenswrapper[5118]: I0223 09:59:34.849278 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:34 crc kubenswrapper[5118]: I0223 09:59:34.902243 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:36 crc kubenswrapper[5118]: I0223 09:59:36.823367 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b6nnh" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="registry-server" containerID="cri-o://7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2" gracePeriod=2 Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.563670 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.714151 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt6bq\" (UniqueName: \"kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq\") pod \"86428ba7-e604-403b-b9a9-c35fa5333e84\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.714879 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities\") pod \"86428ba7-e604-403b-b9a9-c35fa5333e84\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.715474 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content\") pod \"86428ba7-e604-403b-b9a9-c35fa5333e84\" (UID: \"86428ba7-e604-403b-b9a9-c35fa5333e84\") " Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.725939 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities" (OuterVolumeSpecName: "utilities") pod "86428ba7-e604-403b-b9a9-c35fa5333e84" (UID: 
"86428ba7-e604-403b-b9a9-c35fa5333e84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.739641 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq" (OuterVolumeSpecName: "kube-api-access-wt6bq") pod "86428ba7-e604-403b-b9a9-c35fa5333e84" (UID: "86428ba7-e604-403b-b9a9-c35fa5333e84"). InnerVolumeSpecName "kube-api-access-wt6bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.814754 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86428ba7-e604-403b-b9a9-c35fa5333e84" (UID: "86428ba7-e604-403b-b9a9-c35fa5333e84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.827704 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.827736 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt6bq\" (UniqueName: \"kubernetes.io/projected/86428ba7-e604-403b-b9a9-c35fa5333e84-kube-api-access-wt6bq\") on node \"crc\" DevicePath \"\"" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.827748 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86428ba7-e604-403b-b9a9-c35fa5333e84-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.837627 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerID="7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2" exitCode=0 Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.837679 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerDied","Data":"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2"} Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.837711 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b6nnh" event={"ID":"86428ba7-e604-403b-b9a9-c35fa5333e84","Type":"ContainerDied","Data":"54555436d8e3ac2769aa34c10eb07f3bfb957134c625d358dce88eaec586f068"} Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.837733 5118 scope.go:117] "RemoveContainer" containerID="7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.837921 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b6nnh" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.870557 5118 scope.go:117] "RemoveContainer" containerID="9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.880639 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.894070 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b6nnh"] Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.912018 5118 scope.go:117] "RemoveContainer" containerID="6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.939860 5118 scope.go:117] "RemoveContainer" containerID="7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2" Feb 23 09:59:37 crc kubenswrapper[5118]: E0223 09:59:37.944373 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2\": container with ID starting with 7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2 not found: ID does not exist" containerID="7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.944441 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2"} err="failed to get container status \"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2\": rpc error: code = NotFound desc = could not find container \"7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2\": container with ID starting with 7b5aebdbfdfe6a8e0c5232c5347908347e6b84cc56d2d4eaf84c42d748da95a2 not 
found: ID does not exist" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.944476 5118 scope.go:117] "RemoveContainer" containerID="9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9" Feb 23 09:59:37 crc kubenswrapper[5118]: E0223 09:59:37.944822 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9\": container with ID starting with 9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9 not found: ID does not exist" containerID="9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.944963 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9"} err="failed to get container status \"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9\": rpc error: code = NotFound desc = could not find container \"9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9\": container with ID starting with 9f21b6c56a4bb344b37efc67f765e4f4b13d65216292d09df6e5fbeea15a7dd9 not found: ID does not exist" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.945086 5118 scope.go:117] "RemoveContainer" containerID="6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c" Feb 23 09:59:37 crc kubenswrapper[5118]: E0223 09:59:37.945658 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c\": container with ID starting with 6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c not found: ID does not exist" containerID="6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c" Feb 23 09:59:37 crc kubenswrapper[5118]: I0223 09:59:37.945799 5118 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c"} err="failed to get container status \"6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c\": rpc error: code = NotFound desc = could not find container \"6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c\": container with ID starting with 6bcb95ebd9b3d4324fcd6917f501c281a19bab4915608da653bb82269bdd606c not found: ID does not exist" Feb 23 09:59:39 crc kubenswrapper[5118]: I0223 09:59:39.722581 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" path="/var/lib/kubelet/pods/86428ba7-e604-403b-b9a9-c35fa5333e84/volumes" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.195471 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v"] Feb 23 10:00:00 crc kubenswrapper[5118]: E0223 10:00:00.196609 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="extract-content" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.196629 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="extract-content" Feb 23 10:00:00 crc kubenswrapper[5118]: E0223 10:00:00.196664 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="extract-utilities" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.196672 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="extract-utilities" Feb 23 10:00:00 crc kubenswrapper[5118]: E0223 10:00:00.196693 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="registry-server" Feb 23 10:00:00 crc 
kubenswrapper[5118]: I0223 10:00:00.196701 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="registry-server" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.196949 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="86428ba7-e604-403b-b9a9-c35fa5333e84" containerName="registry-server" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.197868 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.200601 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.200904 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.205270 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v"] Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.291473 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.291583 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sb55\" (UniqueName: \"kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") 
" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.291626 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.394003 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.394131 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sb55\" (UniqueName: \"kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.394164 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.395176 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.403064 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.414024 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sb55\" (UniqueName: \"kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55\") pod \"collect-profiles-29530680-2vt6v\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:00 crc kubenswrapper[5118]: I0223 10:00:00.516544 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:01 crc kubenswrapper[5118]: I0223 10:00:01.015348 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v"] Feb 23 10:00:01 crc kubenswrapper[5118]: I0223 10:00:01.078972 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" event={"ID":"b03443fe-245d-498d-9ce4-5a221202d503","Type":"ContainerStarted","Data":"4bc14026fee254e6fa5534db2c9038ba83daee85722c16e4e619d457484e6783"} Feb 23 10:00:02 crc kubenswrapper[5118]: I0223 10:00:02.099471 5118 generic.go:334] "Generic (PLEG): container finished" podID="b03443fe-245d-498d-9ce4-5a221202d503" containerID="64b49385e9431af45d7e05c3b2575d19ca0c81f5b8c3db7ad0c14751075848e0" exitCode=0 Feb 23 10:00:02 crc kubenswrapper[5118]: I0223 10:00:02.099544 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" event={"ID":"b03443fe-245d-498d-9ce4-5a221202d503","Type":"ContainerDied","Data":"64b49385e9431af45d7e05c3b2575d19ca0c81f5b8c3db7ad0c14751075848e0"} Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.682573 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.761952 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sb55\" (UniqueName: \"kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55\") pod \"b03443fe-245d-498d-9ce4-5a221202d503\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.762454 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume\") pod \"b03443fe-245d-498d-9ce4-5a221202d503\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.762550 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume\") pod \"b03443fe-245d-498d-9ce4-5a221202d503\" (UID: \"b03443fe-245d-498d-9ce4-5a221202d503\") " Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.764583 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume" (OuterVolumeSpecName: "config-volume") pod "b03443fe-245d-498d-9ce4-5a221202d503" (UID: "b03443fe-245d-498d-9ce4-5a221202d503"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.775628 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55" (OuterVolumeSpecName: "kube-api-access-9sb55") pod "b03443fe-245d-498d-9ce4-5a221202d503" (UID: "b03443fe-245d-498d-9ce4-5a221202d503"). 
InnerVolumeSpecName "kube-api-access-9sb55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.782319 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b03443fe-245d-498d-9ce4-5a221202d503" (UID: "b03443fe-245d-498d-9ce4-5a221202d503"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.866708 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b03443fe-245d-498d-9ce4-5a221202d503-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.866750 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b03443fe-245d-498d-9ce4-5a221202d503-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:03 crc kubenswrapper[5118]: I0223 10:00:03.866763 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sb55\" (UniqueName: \"kubernetes.io/projected/b03443fe-245d-498d-9ce4-5a221202d503-kube-api-access-9sb55\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:04 crc kubenswrapper[5118]: I0223 10:00:04.122718 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" event={"ID":"b03443fe-245d-498d-9ce4-5a221202d503","Type":"ContainerDied","Data":"4bc14026fee254e6fa5534db2c9038ba83daee85722c16e4e619d457484e6783"} Feb 23 10:00:04 crc kubenswrapper[5118]: I0223 10:00:04.123005 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc14026fee254e6fa5534db2c9038ba83daee85722c16e4e619d457484e6783" Feb 23 10:00:04 crc kubenswrapper[5118]: I0223 10:00:04.122782 5118 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530680-2vt6v" Feb 23 10:00:04 crc kubenswrapper[5118]: I0223 10:00:04.783626 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d"] Feb 23 10:00:04 crc kubenswrapper[5118]: I0223 10:00:04.794948 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530635-wkf8d"] Feb 23 10:00:05 crc kubenswrapper[5118]: I0223 10:00:05.711711 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10ed81de-2549-464b-9243-25e9cf51511c" path="/var/lib/kubelet/pods/10ed81de-2549-464b-9243-25e9cf51511c/volumes" Feb 23 10:00:08 crc kubenswrapper[5118]: I0223 10:00:08.605819 5118 scope.go:117] "RemoveContainer" containerID="a4b0589081c57b575016d17e7d1912dd918259c3dcb31a4c5b71ff4d12099014" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.436168 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:29 crc kubenswrapper[5118]: E0223 10:00:29.437701 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03443fe-245d-498d-9ce4-5a221202d503" containerName="collect-profiles" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.437728 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03443fe-245d-498d-9ce4-5a221202d503" containerName="collect-profiles" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.438036 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03443fe-245d-498d-9ce4-5a221202d503" containerName="collect-profiles" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.441396 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.491150 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.508046 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.508308 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.508667 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstkf\" (UniqueName: \"kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.612489 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xstkf\" (UniqueName: \"kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.612573 5118 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.612632 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.613222 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.613324 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.645896 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstkf\" (UniqueName: \"kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf\") pod \"community-operators-tpmql\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:29 crc kubenswrapper[5118]: I0223 10:00:29.792608 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:30 crc kubenswrapper[5118]: I0223 10:00:30.440656 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:31 crc kubenswrapper[5118]: I0223 10:00:31.444765 5118 generic.go:334] "Generic (PLEG): container finished" podID="73307468-391a-49c5-b704-e25df7331c03" containerID="24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9" exitCode=0 Feb 23 10:00:31 crc kubenswrapper[5118]: I0223 10:00:31.444803 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerDied","Data":"24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9"} Feb 23 10:00:31 crc kubenswrapper[5118]: I0223 10:00:31.445240 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerStarted","Data":"b982264a82c31c9dd4a59dc03175b0d5dc98f3f29d453d33b6eeabc35e311303"} Feb 23 10:00:32 crc kubenswrapper[5118]: I0223 10:00:32.456884 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerStarted","Data":"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411"} Feb 23 10:00:34 crc kubenswrapper[5118]: I0223 10:00:34.487206 5118 generic.go:334] "Generic (PLEG): container finished" podID="73307468-391a-49c5-b704-e25df7331c03" containerID="a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411" exitCode=0 Feb 23 10:00:34 crc kubenswrapper[5118]: I0223 10:00:34.487242 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" 
event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerDied","Data":"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411"} Feb 23 10:00:40 crc kubenswrapper[5118]: I0223 10:00:40.557892 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerStarted","Data":"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa"} Feb 23 10:00:49 crc kubenswrapper[5118]: I0223 10:00:49.792759 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:49 crc kubenswrapper[5118]: I0223 10:00:49.793286 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:49 crc kubenswrapper[5118]: I0223 10:00:49.845599 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:49 crc kubenswrapper[5118]: I0223 10:00:49.875044 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tpmql" podStartSLOduration=12.365393903 podStartE2EDuration="20.87501714s" podCreationTimestamp="2026-02-23 10:00:29 +0000 UTC" firstStartedPulling="2026-02-23 10:00:31.447316722 +0000 UTC m=+11694.451101295" lastFinishedPulling="2026-02-23 10:00:39.956939959 +0000 UTC m=+11702.960724532" observedRunningTime="2026-02-23 10:00:40.584274049 +0000 UTC m=+11703.588058632" watchObservedRunningTime="2026-02-23 10:00:49.87501714 +0000 UTC m=+11712.878801723" Feb 23 10:00:50 crc kubenswrapper[5118]: I0223 10:00:50.721025 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:50 crc kubenswrapper[5118]: I0223 10:00:50.785087 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:52 crc kubenswrapper[5118]: I0223 10:00:52.692159 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tpmql" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="registry-server" containerID="cri-o://e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa" gracePeriod=2 Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.412209 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.562370 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities\") pod \"73307468-391a-49c5-b704-e25df7331c03\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.562476 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content\") pod \"73307468-391a-49c5-b704-e25df7331c03\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.562563 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xstkf\" (UniqueName: \"kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf\") pod \"73307468-391a-49c5-b704-e25df7331c03\" (UID: \"73307468-391a-49c5-b704-e25df7331c03\") " Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.563927 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities" (OuterVolumeSpecName: "utilities") pod "73307468-391a-49c5-b704-e25df7331c03" (UID: 
"73307468-391a-49c5-b704-e25df7331c03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.574379 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf" (OuterVolumeSpecName: "kube-api-access-xstkf") pod "73307468-391a-49c5-b704-e25df7331c03" (UID: "73307468-391a-49c5-b704-e25df7331c03"). InnerVolumeSpecName "kube-api-access-xstkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.631205 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73307468-391a-49c5-b704-e25df7331c03" (UID: "73307468-391a-49c5-b704-e25df7331c03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.664551 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.664587 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73307468-391a-49c5-b704-e25df7331c03-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.664599 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xstkf\" (UniqueName: \"kubernetes.io/projected/73307468-391a-49c5-b704-e25df7331c03-kube-api-access-xstkf\") on node \"crc\" DevicePath \"\"" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.705590 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="73307468-391a-49c5-b704-e25df7331c03" containerID="e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa" exitCode=0 Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.705704 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tpmql" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.720483 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerDied","Data":"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa"} Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.720529 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tpmql" event={"ID":"73307468-391a-49c5-b704-e25df7331c03","Type":"ContainerDied","Data":"b982264a82c31c9dd4a59dc03175b0d5dc98f3f29d453d33b6eeabc35e311303"} Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.720553 5118 scope.go:117] "RemoveContainer" containerID="e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.760328 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.771324 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tpmql"] Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.780253 5118 scope.go:117] "RemoveContainer" containerID="a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.807956 5118 scope.go:117] "RemoveContainer" containerID="24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.856138 5118 scope.go:117] "RemoveContainer" 
containerID="e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa" Feb 23 10:00:53 crc kubenswrapper[5118]: E0223 10:00:53.856721 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa\": container with ID starting with e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa not found: ID does not exist" containerID="e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.857913 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa"} err="failed to get container status \"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa\": rpc error: code = NotFound desc = could not find container \"e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa\": container with ID starting with e57dec1933aa5435768df8a9a595859a76e25e1b11040e881647d17875784baa not found: ID does not exist" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.857981 5118 scope.go:117] "RemoveContainer" containerID="a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411" Feb 23 10:00:53 crc kubenswrapper[5118]: E0223 10:00:53.858394 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411\": container with ID starting with a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411 not found: ID does not exist" containerID="a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.858422 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411"} err="failed to get container status \"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411\": rpc error: code = NotFound desc = could not find container \"a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411\": container with ID starting with a35f64c1cd1a294c1f4a5105bccc5e67c39a82d256291768624169094e7c8411 not found: ID does not exist" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.858440 5118 scope.go:117] "RemoveContainer" containerID="24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9" Feb 23 10:00:53 crc kubenswrapper[5118]: E0223 10:00:53.858783 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9\": container with ID starting with 24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9 not found: ID does not exist" containerID="24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9" Feb 23 10:00:53 crc kubenswrapper[5118]: I0223 10:00:53.858835 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9"} err="failed to get container status \"24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9\": rpc error: code = NotFound desc = could not find container \"24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9\": container with ID starting with 24de1bcc3790806c74ebdcdf3b1e7bd26ca2100f50a932d3df77b7dedc0204c9 not found: ID does not exist" Feb 23 10:00:55 crc kubenswrapper[5118]: I0223 10:00:55.710749 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73307468-391a-49c5-b704-e25df7331c03" path="/var/lib/kubelet/pods/73307468-391a-49c5-b704-e25df7331c03/volumes" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 
10:01:00.165087 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530681-hz74l"] Feb 23 10:01:00 crc kubenswrapper[5118]: E0223 10:01:00.166347 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="extract-utilities" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.166374 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="extract-utilities" Feb 23 10:01:00 crc kubenswrapper[5118]: E0223 10:01:00.166408 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="extract-content" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.166419 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="extract-content" Feb 23 10:01:00 crc kubenswrapper[5118]: E0223 10:01:00.166455 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="registry-server" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.166468 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="registry-server" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.166986 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="73307468-391a-49c5-b704-e25df7331c03" containerName="registry-server" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.177601 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.208206 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530681-hz74l"] Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.301525 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tt4\" (UniqueName: \"kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.301624 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.301697 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.301792 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.403040 5118 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.403344 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.403482 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.403654 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tt4\" (UniqueName: \"kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.410446 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.410446 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.416286 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.429033 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tt4\" (UniqueName: \"kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4\") pod \"keystone-cron-29530681-hz74l\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.506718 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:00 crc kubenswrapper[5118]: I0223 10:01:00.985738 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530681-hz74l"] Feb 23 10:01:01 crc kubenswrapper[5118]: I0223 10:01:01.796169 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530681-hz74l" event={"ID":"76e9f12c-042d-4e31-bb0b-ff59287da006","Type":"ContainerStarted","Data":"f605bcc8aa4ea35040da99d0c958ff06c007596413cfe1b61b8b48976e849ed0"} Feb 23 10:01:01 crc kubenswrapper[5118]: I0223 10:01:01.796479 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530681-hz74l" event={"ID":"76e9f12c-042d-4e31-bb0b-ff59287da006","Type":"ContainerStarted","Data":"613769a280d80a81c8bf1bb8e769ec86ad5c6f351a616613c53c1d155386074c"} Feb 23 10:01:01 crc kubenswrapper[5118]: I0223 10:01:01.827128 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29530681-hz74l" podStartSLOduration=1.827088485 podStartE2EDuration="1.827088485s" podCreationTimestamp="2026-02-23 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 10:01:01.817593044 +0000 UTC m=+11724.821377637" watchObservedRunningTime="2026-02-23 10:01:01.827088485 +0000 UTC m=+11724.830873058" Feb 23 10:01:04 crc kubenswrapper[5118]: I0223 10:01:04.826021 5118 generic.go:334] "Generic (PLEG): container finished" podID="76e9f12c-042d-4e31-bb0b-ff59287da006" containerID="f605bcc8aa4ea35040da99d0c958ff06c007596413cfe1b61b8b48976e849ed0" exitCode=0 Feb 23 10:01:04 crc kubenswrapper[5118]: I0223 10:01:04.826116 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530681-hz74l" 
event={"ID":"76e9f12c-042d-4e31-bb0b-ff59287da006","Type":"ContainerDied","Data":"f605bcc8aa4ea35040da99d0c958ff06c007596413cfe1b61b8b48976e849ed0"} Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.485772 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.533925 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle\") pod \"76e9f12c-042d-4e31-bb0b-ff59287da006\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.533968 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys\") pod \"76e9f12c-042d-4e31-bb0b-ff59287da006\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.534064 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9tt4\" (UniqueName: \"kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4\") pod \"76e9f12c-042d-4e31-bb0b-ff59287da006\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.534204 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data\") pod \"76e9f12c-042d-4e31-bb0b-ff59287da006\" (UID: \"76e9f12c-042d-4e31-bb0b-ff59287da006\") " Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.584155 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "76e9f12c-042d-4e31-bb0b-ff59287da006" (UID: "76e9f12c-042d-4e31-bb0b-ff59287da006"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.588599 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4" (OuterVolumeSpecName: "kube-api-access-l9tt4") pod "76e9f12c-042d-4e31-bb0b-ff59287da006" (UID: "76e9f12c-042d-4e31-bb0b-ff59287da006"). InnerVolumeSpecName "kube-api-access-l9tt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.604724 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e9f12c-042d-4e31-bb0b-ff59287da006" (UID: "76e9f12c-042d-4e31-bb0b-ff59287da006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.614772 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data" (OuterVolumeSpecName: "config-data") pod "76e9f12c-042d-4e31-bb0b-ff59287da006" (UID: "76e9f12c-042d-4e31-bb0b-ff59287da006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.636950 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9tt4\" (UniqueName: \"kubernetes.io/projected/76e9f12c-042d-4e31-bb0b-ff59287da006-kube-api-access-l9tt4\") on node \"crc\" DevicePath \"\"" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.636983 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-config-data\") on node \"crc\" DevicePath \"\"" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.636992 5118 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.637003 5118 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/76e9f12c-042d-4e31-bb0b-ff59287da006-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.845048 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530681-hz74l" event={"ID":"76e9f12c-042d-4e31-bb0b-ff59287da006","Type":"ContainerDied","Data":"613769a280d80a81c8bf1bb8e769ec86ad5c6f351a616613c53c1d155386074c"} Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.845126 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530681-hz74l" Feb 23 10:01:06 crc kubenswrapper[5118]: I0223 10:01:06.845149 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613769a280d80a81c8bf1bb8e769ec86ad5c6f351a616613c53c1d155386074c" Feb 23 10:01:18 crc kubenswrapper[5118]: I0223 10:01:18.721522 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 23 10:01:19 crc kubenswrapper[5118]: I0223 10:01:19.982438 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" podUID="4dae455a-bb65-4035-9cac-679bcb07e7f3" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:20 crc kubenswrapper[5118]: I0223 10:01:20.842319 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="54fb818d-8422-4517-84b1-b9d71a9f213a" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.1.102:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:20 crc kubenswrapper[5118]: I0223 10:01:20.883341 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="1d980983-651d-4df6-a7ca-18d1191300cf" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.105:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:21 crc kubenswrapper[5118]: I0223 10:01:21.612282 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="6b3cd17b-fa93-44fa-9571-cebff83d65ad" containerName="cinder-backup" probeResult="failure" output="Get 
\"http://10.217.1.103:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:23 crc kubenswrapper[5118]: I0223 10:01:23.024290 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" podUID="4dae455a-bb65-4035-9cac-679bcb07e7f3" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:23 crc kubenswrapper[5118]: I0223 10:01:23.720222 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 23 10:01:25 crc kubenswrapper[5118]: I0223 10:01:25.020347 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-5vhwv" podUID="4e891361-5f36-4ebe-a394-c553a139765a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:25 crc kubenswrapper[5118]: I0223 10:01:25.883390 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="54fb818d-8422-4517-84b1-b9d71a9f213a" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.1.102:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:25 crc kubenswrapper[5118]: I0223 10:01:25.924382 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="1d980983-651d-4df6-a7ca-18d1191300cf" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.105:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:26 crc kubenswrapper[5118]: I0223 10:01:26.065532 5118 prober.go:107] "Probe 
failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" podUID="4dae455a-bb65-4035-9cac-679bcb07e7f3" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:26 crc kubenswrapper[5118]: I0223 10:01:26.654440 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="6b3cd17b-fa93-44fa-9571-cebff83d65ad" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.1.103:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.172405 5118 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-75bvd" podUID="ad24076e-75f9-406f-8f31-11211974625d" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.213429 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-75bvd" podUID="ad24076e-75f9-406f-8f31-11211974625d" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.254433 5118 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kzwjr container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.65:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.254523 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-kzwjr" 
podUID="2e64f70b-4830-4fef-94c0-d5eef18c5eb0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.65:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.721015 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.721175 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.722386 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.722561 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerName="ceilometer-central-agent" containerID="cri-o://443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84" gracePeriod=30 Feb 23 10:01:28 crc kubenswrapper[5118]: I0223 10:01:28.723006 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Feb 23 10:01:29 crc kubenswrapper[5118]: I0223 10:01:29.107410 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-kf2x5" podUID="4dae455a-bb65-4035-9cac-679bcb07e7f3" containerName="hostpath-provisioner" 
probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.925779 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-volume-volume1-0" podUID="54fb818d-8422-4517-84b1-b9d71a9f213a" containerName="cinder-volume" probeResult="failure" output="Get \"http://10.217.1.102:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.926205 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.927665 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-volume" containerStatusID={"Type":"cri-o","ID":"4c79f177ad515968885c3d8d11e9a38761b4cba8b2a220929d8acd2bcfa1ff16"} pod="openstack/cinder-volume-volume1-0" containerMessage="Container cinder-volume failed liveness probe, will be restarted" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.927790 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="54fb818d-8422-4517-84b1-b9d71a9f213a" containerName="cinder-volume" containerID="cri-o://4c79f177ad515968885c3d8d11e9a38761b4cba8b2a220929d8acd2bcfa1ff16" gracePeriod=30 Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.966520 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="1d980983-651d-4df6-a7ca-18d1191300cf" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.1.105:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.966634 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" 
Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.969193 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"8ef69207cf9008b4276250fa2cb094863b20603f2bc298365835ad2fdba89005"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Feb 23 10:01:30 crc kubenswrapper[5118]: I0223 10:01:30.969267 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1d980983-651d-4df6-a7ca-18d1191300cf" containerName="cinder-scheduler" containerID="cri-o://8ef69207cf9008b4276250fa2cb094863b20603f2bc298365835ad2fdba89005" gracePeriod=30 Feb 23 10:01:31 crc kubenswrapper[5118]: I0223 10:01:31.734425 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-backup-0" podUID="6b3cd17b-fa93-44fa-9571-cebff83d65ad" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.217.1.103:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:31 crc kubenswrapper[5118]: I0223 10:01:31.734547 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-backup-0" Feb 23 10:01:31 crc kubenswrapper[5118]: I0223 10:01:31.734446 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-98sv6" podUID="d42094d4-0a02-4b3d-8bad-7d0a0bda3d47" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 10:01:31 crc kubenswrapper[5118]: I0223 10:01:31.735805 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-backup" containerStatusID={"Type":"cri-o","ID":"6881a67160010059398317ace9154f13a06dbd58244c977e52407e940a0704a1"} pod="openstack/cinder-backup-0" 
containerMessage="Container cinder-backup failed liveness probe, will be restarted" Feb 23 10:01:31 crc kubenswrapper[5118]: I0223 10:01:31.735902 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="6b3cd17b-fa93-44fa-9571-cebff83d65ad" containerName="cinder-backup" containerID="cri-o://6881a67160010059398317ace9154f13a06dbd58244c977e52407e940a0704a1" gracePeriod=30 Feb 23 10:01:32 crc kubenswrapper[5118]: I0223 10:01:32.976060 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:01:32 crc kubenswrapper[5118]: I0223 10:01:32.976815 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:01:33 crc kubenswrapper[5118]: E0223 10:01:33.834912 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc23b5987_be35_42db_b1d5_cfaadfbdb9e0.slice/crio-conmon-443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84.scope\": RecentStats: unable to find data in memory cache]" Feb 23 10:01:34 crc kubenswrapper[5118]: I0223 10:01:34.179713 5118 generic.go:334] "Generic (PLEG): container finished" podID="c23b5987-be35-42db-b1d5-cfaadfbdb9e0" containerID="443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84" exitCode=0 Feb 23 10:01:34 crc kubenswrapper[5118]: I0223 10:01:34.179786 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerDied","Data":"443ecb8a6a62e13eb189433f1f439206a70612506122bb0d89904d4a7a871c84"} Feb 23 10:01:37 crc kubenswrapper[5118]: I0223 10:01:37.225519 5118 generic.go:334] "Generic (PLEG): container finished" podID="54fb818d-8422-4517-84b1-b9d71a9f213a" containerID="4c79f177ad515968885c3d8d11e9a38761b4cba8b2a220929d8acd2bcfa1ff16" exitCode=0 Feb 23 10:01:37 crc kubenswrapper[5118]: I0223 10:01:37.225616 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"54fb818d-8422-4517-84b1-b9d71a9f213a","Type":"ContainerDied","Data":"4c79f177ad515968885c3d8d11e9a38761b4cba8b2a220929d8acd2bcfa1ff16"} Feb 23 10:01:37 crc kubenswrapper[5118]: I0223 10:01:37.235390 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c23b5987-be35-42db-b1d5-cfaadfbdb9e0","Type":"ContainerStarted","Data":"96d181b0e99506f008bf3192b891bcec445700b13555204e4949495484183910"} Feb 23 10:01:38 crc kubenswrapper[5118]: I0223 10:01:38.248425 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"54fb818d-8422-4517-84b1-b9d71a9f213a","Type":"ContainerStarted","Data":"f44328d0b0f175a9cb9ca627c5eb172f657217549884d5b5a86f5f35eb958acc"} Feb 23 10:01:38 crc kubenswrapper[5118]: I0223 10:01:38.253321 5118 generic.go:334] "Generic (PLEG): container finished" podID="1d980983-651d-4df6-a7ca-18d1191300cf" containerID="8ef69207cf9008b4276250fa2cb094863b20603f2bc298365835ad2fdba89005" exitCode=0 Feb 23 10:01:38 crc kubenswrapper[5118]: I0223 10:01:38.253793 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d980983-651d-4df6-a7ca-18d1191300cf","Type":"ContainerDied","Data":"8ef69207cf9008b4276250fa2cb094863b20603f2bc298365835ad2fdba89005"} Feb 23 10:01:38 crc kubenswrapper[5118]: I0223 10:01:38.800815 5118 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 23 10:01:40 crc kubenswrapper[5118]: I0223 10:01:40.276255 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1d980983-651d-4df6-a7ca-18d1191300cf","Type":"ContainerStarted","Data":"1b3e87e0ec3f03f4d3a3e35e6fe3ead314ad238a5d1038498281cdfaa8175e3b"} Feb 23 10:01:40 crc kubenswrapper[5118]: I0223 10:01:40.818953 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 23 10:01:41 crc kubenswrapper[5118]: I0223 10:01:41.321851 5118 generic.go:334] "Generic (PLEG): container finished" podID="6b3cd17b-fa93-44fa-9571-cebff83d65ad" containerID="6881a67160010059398317ace9154f13a06dbd58244c977e52407e940a0704a1" exitCode=0 Feb 23 10:01:41 crc kubenswrapper[5118]: I0223 10:01:41.323347 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6b3cd17b-fa93-44fa-9571-cebff83d65ad","Type":"ContainerDied","Data":"6881a67160010059398317ace9154f13a06dbd58244c977e52407e940a0704a1"} Feb 23 10:01:41 crc kubenswrapper[5118]: I0223 10:01:41.323378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"6b3cd17b-fa93-44fa-9571-cebff83d65ad","Type":"ContainerStarted","Data":"2f9fa584d86b4707db81dfdb6299dc4bd8668a088eb3bd1812958e5cc3d81319"} Feb 23 10:01:43 crc kubenswrapper[5118]: I0223 10:01:43.817171 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 23 10:01:44 crc kubenswrapper[5118]: I0223 10:01:44.571719 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 23 10:01:45 crc kubenswrapper[5118]: I0223 10:01:45.827815 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 23 10:01:49 crc kubenswrapper[5118]: I0223 
10:01:49.580140 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 23 10:02:02 crc kubenswrapper[5118]: I0223 10:02:02.975635 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:02:02 crc kubenswrapper[5118]: I0223 10:02:02.976280 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:02:32 crc kubenswrapper[5118]: I0223 10:02:32.975852 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:02:32 crc kubenswrapper[5118]: I0223 10:02:32.976554 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:02:32 crc kubenswrapper[5118]: I0223 10:02:32.976609 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 10:02:32 crc kubenswrapper[5118]: I0223 10:02:32.977579 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 10:02:32 crc kubenswrapper[5118]: I0223 10:02:32.977633 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" gracePeriod=600 Feb 23 10:02:33 crc kubenswrapper[5118]: E0223 10:02:33.126350 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:02:33 crc kubenswrapper[5118]: I0223 10:02:33.828421 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" exitCode=0 Feb 23 10:02:33 crc kubenswrapper[5118]: I0223 10:02:33.828477 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03"} Feb 23 10:02:33 crc kubenswrapper[5118]: I0223 10:02:33.828515 5118 scope.go:117] "RemoveContainer" containerID="34e85f85810378f5daa565e70996bb730b8a869c87f37f0047be50ab8cad0458" Feb 23 10:02:33 crc kubenswrapper[5118]: I0223 10:02:33.829402 5118 
scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:02:33 crc kubenswrapper[5118]: E0223 10:02:33.829763 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:02:48 crc kubenswrapper[5118]: I0223 10:02:48.697121 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:02:48 crc kubenswrapper[5118]: E0223 10:02:48.697937 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:02:59 crc kubenswrapper[5118]: I0223 10:02:59.697451 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:02:59 crc kubenswrapper[5118]: E0223 10:02:59.698163 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:03:10 crc kubenswrapper[5118]: I0223 
10:03:10.698383 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:03:10 crc kubenswrapper[5118]: E0223 10:03:10.699282 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:03:22 crc kubenswrapper[5118]: I0223 10:03:22.699831 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:03:22 crc kubenswrapper[5118]: E0223 10:03:22.700838 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.674079 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:26 crc kubenswrapper[5118]: E0223 10:03:26.675117 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e9f12c-042d-4e31-bb0b-ff59287da006" containerName="keystone-cron" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.675132 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e9f12c-042d-4e31-bb0b-ff59287da006" containerName="keystone-cron" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.675414 5118 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="76e9f12c-042d-4e31-bb0b-ff59287da006" containerName="keystone-cron" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.677180 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.686946 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.752890 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.753053 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.753759 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fpxt\" (UniqueName: \"kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.856165 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " 
pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.856315 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.856460 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fpxt\" (UniqueName: \"kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.856962 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.857143 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:26 crc kubenswrapper[5118]: I0223 10:03:26.879921 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fpxt\" (UniqueName: \"kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt\") pod \"redhat-operators-9jf8x\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " pod="openshift-marketplace/redhat-operators-9jf8x" Feb 
23 10:03:27 crc kubenswrapper[5118]: I0223 10:03:27.007548 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:27 crc kubenswrapper[5118]: I0223 10:03:27.535128 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:28 crc kubenswrapper[5118]: I0223 10:03:28.380488 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e8d71de-228f-4d13-af89-962375f4561d" containerID="20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa" exitCode=0 Feb 23 10:03:28 crc kubenswrapper[5118]: I0223 10:03:28.380576 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerDied","Data":"20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa"} Feb 23 10:03:28 crc kubenswrapper[5118]: I0223 10:03:28.380781 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerStarted","Data":"340261b2f1ca14a918366b4664902c8673da9ce1aaea45e0b6e95ed45302fd7e"} Feb 23 10:03:28 crc kubenswrapper[5118]: I0223 10:03:28.382627 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 10:03:29 crc kubenswrapper[5118]: I0223 10:03:29.391819 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerStarted","Data":"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233"} Feb 23 10:03:34 crc kubenswrapper[5118]: I0223 10:03:34.443676 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e8d71de-228f-4d13-af89-962375f4561d" containerID="b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233" exitCode=0 Feb 23 
10:03:34 crc kubenswrapper[5118]: I0223 10:03:34.444030 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerDied","Data":"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233"} Feb 23 10:03:35 crc kubenswrapper[5118]: I0223 10:03:35.456447 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerStarted","Data":"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b"} Feb 23 10:03:35 crc kubenswrapper[5118]: I0223 10:03:35.479427 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jf8x" podStartSLOduration=3.017167608 podStartE2EDuration="9.479394838s" podCreationTimestamp="2026-02-23 10:03:26 +0000 UTC" firstStartedPulling="2026-02-23 10:03:28.382381357 +0000 UTC m=+11871.386165930" lastFinishedPulling="2026-02-23 10:03:34.844608587 +0000 UTC m=+11877.848393160" observedRunningTime="2026-02-23 10:03:35.476851917 +0000 UTC m=+11878.480636490" watchObservedRunningTime="2026-02-23 10:03:35.479394838 +0000 UTC m=+11878.483179411" Feb 23 10:03:37 crc kubenswrapper[5118]: I0223 10:03:37.009287 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:37 crc kubenswrapper[5118]: I0223 10:03:37.009338 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:37 crc kubenswrapper[5118]: I0223 10:03:37.712395 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:03:37 crc kubenswrapper[5118]: E0223 10:03:37.713081 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:03:38 crc kubenswrapper[5118]: I0223 10:03:38.055275 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9jf8x" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="registry-server" probeResult="failure" output=< Feb 23 10:03:38 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 10:03:38 crc kubenswrapper[5118]: > Feb 23 10:03:47 crc kubenswrapper[5118]: I0223 10:03:47.060814 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:47 crc kubenswrapper[5118]: I0223 10:03:47.125152 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:47 crc kubenswrapper[5118]: I0223 10:03:47.304377 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:48 crc kubenswrapper[5118]: I0223 10:03:48.635432 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jf8x" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="registry-server" containerID="cri-o://d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b" gracePeriod=2 Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.249221 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.301971 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fpxt\" (UniqueName: \"kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt\") pod \"5e8d71de-228f-4d13-af89-962375f4561d\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.302055 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities\") pod \"5e8d71de-228f-4d13-af89-962375f4561d\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.302118 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content\") pod \"5e8d71de-228f-4d13-af89-962375f4561d\" (UID: \"5e8d71de-228f-4d13-af89-962375f4561d\") " Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.303260 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities" (OuterVolumeSpecName: "utilities") pod "5e8d71de-228f-4d13-af89-962375f4561d" (UID: "5e8d71de-228f-4d13-af89-962375f4561d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.308651 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt" (OuterVolumeSpecName: "kube-api-access-2fpxt") pod "5e8d71de-228f-4d13-af89-962375f4561d" (UID: "5e8d71de-228f-4d13-af89-962375f4561d"). InnerVolumeSpecName "kube-api-access-2fpxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.406134 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fpxt\" (UniqueName: \"kubernetes.io/projected/5e8d71de-228f-4d13-af89-962375f4561d-kube-api-access-2fpxt\") on node \"crc\" DevicePath \"\"" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.406178 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.416850 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e8d71de-228f-4d13-af89-962375f4561d" (UID: "5e8d71de-228f-4d13-af89-962375f4561d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.508079 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e8d71de-228f-4d13-af89-962375f4561d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.650646 5118 generic.go:334] "Generic (PLEG): container finished" podID="5e8d71de-228f-4d13-af89-962375f4561d" containerID="d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b" exitCode=0 Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.650730 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jf8x" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.650763 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerDied","Data":"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b"} Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.652116 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jf8x" event={"ID":"5e8d71de-228f-4d13-af89-962375f4561d","Type":"ContainerDied","Data":"340261b2f1ca14a918366b4664902c8673da9ce1aaea45e0b6e95ed45302fd7e"} Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.652156 5118 scope.go:117] "RemoveContainer" containerID="d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.681733 5118 scope.go:117] "RemoveContainer" containerID="b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.701206 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.715998 5118 scope.go:117] "RemoveContainer" containerID="20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.718396 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jf8x"] Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.765193 5118 scope.go:117] "RemoveContainer" containerID="d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b" Feb 23 10:03:49 crc kubenswrapper[5118]: E0223 10:03:49.769339 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b\": container with ID starting with d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b not found: ID does not exist" containerID="d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.769375 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b"} err="failed to get container status \"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b\": rpc error: code = NotFound desc = could not find container \"d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b\": container with ID starting with d2cdecc29de10860e6c231b1c25d93bf7bba92f1fbbde272e394975ba234cc0b not found: ID does not exist" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.769394 5118 scope.go:117] "RemoveContainer" containerID="b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233" Feb 23 10:03:49 crc kubenswrapper[5118]: E0223 10:03:49.769986 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233\": container with ID starting with b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233 not found: ID does not exist" containerID="b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.770008 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233"} err="failed to get container status \"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233\": rpc error: code = NotFound desc = could not find container \"b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233\": container with ID 
starting with b16fe25f676fc8d8db1d24a4bd965d959351b463ce340c15e449256051d0f233 not found: ID does not exist" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.770023 5118 scope.go:117] "RemoveContainer" containerID="20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa" Feb 23 10:03:49 crc kubenswrapper[5118]: E0223 10:03:49.770533 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa\": container with ID starting with 20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa not found: ID does not exist" containerID="20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa" Feb 23 10:03:49 crc kubenswrapper[5118]: I0223 10:03:49.770559 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa"} err="failed to get container status \"20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa\": rpc error: code = NotFound desc = could not find container \"20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa\": container with ID starting with 20d43699201dc377ad940f99fad0a512e5f5e3a40d4709e84a676cb9a42ddcfa not found: ID does not exist" Feb 23 10:03:51 crc kubenswrapper[5118]: I0223 10:03:51.697699 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:03:51 crc kubenswrapper[5118]: E0223 10:03:51.698287 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:03:51 crc kubenswrapper[5118]: I0223 10:03:51.711600 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e8d71de-228f-4d13-af89-962375f4561d" path="/var/lib/kubelet/pods/5e8d71de-228f-4d13-af89-962375f4561d/volumes" Feb 23 10:04:05 crc kubenswrapper[5118]: I0223 10:04:05.697292 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:04:05 crc kubenswrapper[5118]: E0223 10:04:05.698364 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:04:16 crc kubenswrapper[5118]: I0223 10:04:16.697666 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:04:16 crc kubenswrapper[5118]: E0223 10:04:16.698690 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:04:27 crc kubenswrapper[5118]: I0223 10:04:27.704289 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:04:27 crc kubenswrapper[5118]: E0223 10:04:27.705002 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:04:39 crc kubenswrapper[5118]: I0223 10:04:39.698336 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:04:39 crc kubenswrapper[5118]: E0223 10:04:39.699130 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:04:54 crc kubenswrapper[5118]: I0223 10:04:54.697446 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:04:54 crc kubenswrapper[5118]: E0223 10:04:54.698217 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:05:06 crc kubenswrapper[5118]: I0223 10:05:06.697751 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:05:06 crc kubenswrapper[5118]: E0223 10:05:06.699021 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:05:20 crc kubenswrapper[5118]: I0223 10:05:20.698279 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:05:20 crc kubenswrapper[5118]: E0223 10:05:20.699465 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:05:34 crc kubenswrapper[5118]: I0223 10:05:34.698126 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:05:34 crc kubenswrapper[5118]: E0223 10:05:34.698870 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:05:45 crc kubenswrapper[5118]: I0223 10:05:45.698378 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:05:45 crc kubenswrapper[5118]: E0223 10:05:45.699157 5118 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:05:57 crc kubenswrapper[5118]: I0223 10:05:57.734687 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:05:57 crc kubenswrapper[5118]: E0223 10:05:57.735399 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:06:11 crc kubenswrapper[5118]: I0223 10:06:11.697550 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:06:11 crc kubenswrapper[5118]: E0223 10:06:11.698429 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:06:22 crc kubenswrapper[5118]: I0223 10:06:22.697949 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:06:22 crc kubenswrapper[5118]: E0223 10:06:22.698857 5118 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:06:33 crc kubenswrapper[5118]: I0223 10:06:33.698735 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:06:33 crc kubenswrapper[5118]: E0223 10:06:33.699570 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:06:47 crc kubenswrapper[5118]: I0223 10:06:47.707509 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:06:47 crc kubenswrapper[5118]: E0223 10:06:47.708369 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:07:01 crc kubenswrapper[5118]: I0223 10:07:01.697976 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:07:01 crc kubenswrapper[5118]: E0223 10:07:01.698859 5118 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:07:12 crc kubenswrapper[5118]: I0223 10:07:12.698342 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:07:12 crc kubenswrapper[5118]: E0223 10:07:12.699374 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:07:24 crc kubenswrapper[5118]: I0223 10:07:24.696991 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:07:24 crc kubenswrapper[5118]: E0223 10:07:24.697819 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:07:35 crc kubenswrapper[5118]: I0223 10:07:35.697958 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:07:36 crc kubenswrapper[5118]: I0223 
10:07:36.058157 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c"}
Feb 23 10:08:34 crc kubenswrapper[5118]: I0223 10:08:34.709338 5118 generic.go:334] "Generic (PLEG): container finished" podID="b5f49473-b37e-44f5-a7cf-ad38cd064729" containerID="bbdf74d22d0c7f3828b26e17f7782034ae071f0204d04bbae8d79f7095d5e51b" exitCode=1
Feb 23 10:08:34 crc kubenswrapper[5118]: I0223 10:08:34.709424 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5f49473-b37e-44f5-a7cf-ad38cd064729","Type":"ContainerDied","Data":"bbdf74d22d0c7f3828b26e17f7782034ae071f0204d04bbae8d79f7095d5e51b"}
Feb 23 10:08:34 crc kubenswrapper[5118]: E0223 10:08:34.787041 5118 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f49473_b37e_44f5_a7cf_ad38cd064729.slice/crio-conmon-bbdf74d22d0c7f3828b26e17f7782034ae071f0204d04bbae8d79f7095d5e51b.scope\": RecentStats: unable to find data in memory cache]"
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.172806 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273088 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273534 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273566 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8hk\" (UniqueName: \"kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273624 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273707 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273745 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273810 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273911 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.273935 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs\") pod \"b5f49473-b37e-44f5-a7cf-ad38cd064729\" (UID: \"b5f49473-b37e-44f5-a7cf-ad38cd064729\") "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.275014 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.276035 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data" (OuterVolumeSpecName: "config-data") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.279556 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.280367 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk" (OuterVolumeSpecName: "kube-api-access-nl8hk") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "kube-api-access-nl8hk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.289199 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.303156 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.319984 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.325292 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.353328 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b5f49473-b37e-44f5-a7cf-ad38cd064729" (UID: "b5f49473-b37e-44f5-a7cf-ad38cd064729"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376723 5118 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376768 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376784 5118 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ssh-key\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376797 5118 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376811 5118 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5f49473-b37e-44f5-a7cf-ad38cd064729-ca-certs\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376856 5118 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376869 5118 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5f49473-b37e-44f5-a7cf-ad38cd064729-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376884 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8hk\" (UniqueName: \"kubernetes.io/projected/b5f49473-b37e-44f5-a7cf-ad38cd064729-kube-api-access-nl8hk\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.376905 5118 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5f49473-b37e-44f5-a7cf-ad38cd064729-config-data\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.397625 5118 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.479546 5118 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.730488 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5f49473-b37e-44f5-a7cf-ad38cd064729","Type":"ContainerDied","Data":"ed2cb317846b7d2ebb4688b6de2060936d6e9bee5d9a88fd7e03ffbfb6c304a0"}
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.730803 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed2cb317846b7d2ebb4688b6de2060936d6e9bee5d9a88fd7e03ffbfb6c304a0"
Feb 23 10:08:36 crc kubenswrapper[5118]: I0223 10:08:36.730607 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.051735 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 10:08:47 crc kubenswrapper[5118]: E0223 10:08:47.052888 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f49473-b37e-44f5-a7cf-ad38cd064729" containerName="tempest-tests-tempest-tests-runner"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.052909 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f49473-b37e-44f5-a7cf-ad38cd064729" containerName="tempest-tests-tempest-tests-runner"
Feb 23 10:08:47 crc kubenswrapper[5118]: E0223 10:08:47.052944 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="extract-content"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.052953 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="extract-content"
Feb 23 10:08:47 crc kubenswrapper[5118]: E0223 10:08:47.052972 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="registry-server"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.052983 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="registry-server"
Feb 23 10:08:47 crc kubenswrapper[5118]: E0223 10:08:47.052993 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="extract-utilities"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.053002 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="extract-utilities"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.053313 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f49473-b37e-44f5-a7cf-ad38cd064729" containerName="tempest-tests-tempest-tests-runner"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.053342 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e8d71de-228f-4d13-af89-962375f4561d" containerName="registry-server"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.054389 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.057066 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-cbpwf"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.062815 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.197847 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.198186 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5rs2\" (UniqueName: \"kubernetes.io/projected/ea4eec4e-c875-4140-a73f-51881eda1861-kube-api-access-j5rs2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.300173 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5rs2\" (UniqueName: \"kubernetes.io/projected/ea4eec4e-c875-4140-a73f-51881eda1861-kube-api-access-j5rs2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.300452 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.300966 5118 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.326087 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5rs2\" (UniqueName: \"kubernetes.io/projected/ea4eec4e-c875-4140-a73f-51881eda1861-kube-api-access-j5rs2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.333914 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ea4eec4e-c875-4140-a73f-51881eda1861\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.385024 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.857925 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Feb 23 10:08:47 crc kubenswrapper[5118]: I0223 10:08:47.870963 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 10:08:48 crc kubenswrapper[5118]: I0223 10:08:48.865378 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ea4eec4e-c875-4140-a73f-51881eda1861","Type":"ContainerStarted","Data":"30f2d1c7f3661f09edfe30d1150b31a6c56ff4b43b4f4bd12b94633b34ff0549"}
Feb 23 10:08:49 crc kubenswrapper[5118]: I0223 10:08:49.876823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ea4eec4e-c875-4140-a73f-51881eda1861","Type":"ContainerStarted","Data":"6bff1351b682a37d16ac59c91992ae2ebe1e5291ce3193a852b7468803516539"}
Feb 23 10:08:49 crc kubenswrapper[5118]: I0223 10:08:49.892010 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.779672541 podStartE2EDuration="2.891981475s" podCreationTimestamp="2026-02-23 10:08:47 +0000 UTC" firstStartedPulling="2026-02-23 10:08:47.870633091 +0000 UTC m=+12190.874417684" lastFinishedPulling="2026-02-23 10:08:48.982942015 +0000 UTC m=+12191.986726618" observedRunningTime="2026-02-23 10:08:49.889216878 +0000 UTC m=+12192.893001451" watchObservedRunningTime="2026-02-23 10:08:49.891981475 +0000 UTC m=+12192.895766048"
Feb 23 10:10:02 crc kubenswrapper[5118]: I0223 10:10:02.975410 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 10:10:02 crc kubenswrapper[5118]: I0223 10:10:02.975950 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.646415 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lrbmh/must-gather-l59jv"]
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.648426 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.656304 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lrbmh"/"kube-root-ca.crt"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.656494 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lrbmh"/"default-dockercfg-jxmnl"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.656496 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lrbmh"/"openshift-service-ca.crt"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.673658 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lrbmh/must-gather-l59jv"]
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.791715 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.792149 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gwl\" (UniqueName: \"kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.894362 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gwl\" (UniqueName: \"kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.894564 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.895316 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.940423 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gwl\" (UniqueName: \"kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl\") pod \"must-gather-l59jv\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:06 crc kubenswrapper[5118]: I0223 10:10:06.976197 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/must-gather-l59jv"
Feb 23 10:10:07 crc kubenswrapper[5118]: I0223 10:10:07.449945 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lrbmh/must-gather-l59jv"]
Feb 23 10:10:07 crc kubenswrapper[5118]: I0223 10:10:07.739574 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/must-gather-l59jv" event={"ID":"ea04ed6c-6822-4375-809a-d0c8ab5f484b","Type":"ContainerStarted","Data":"0c099812b1c6d6637812d966e7042a93ec7b73b30ef61cc871901aaf9088c1aa"}
Feb 23 10:10:14 crc kubenswrapper[5118]: I0223 10:10:14.839013 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/must-gather-l59jv" event={"ID":"ea04ed6c-6822-4375-809a-d0c8ab5f484b","Type":"ContainerStarted","Data":"67aa9354bf4b1b213ddcae4c5c3d11584acce1e0449a49a8affac0253899a251"}
Feb 23 10:10:14 crc kubenswrapper[5118]: I0223 10:10:14.839628 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/must-gather-l59jv" event={"ID":"ea04ed6c-6822-4375-809a-d0c8ab5f484b","Type":"ContainerStarted","Data":"b67fa2bd0975bc3ad32231ffc41e99039dd9a8a3ccaf41df33670791e05eff66"}
Feb 23 10:10:14 crc kubenswrapper[5118]: I0223 10:10:14.861395 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lrbmh/must-gather-l59jv" podStartSLOduration=2.536689425 podStartE2EDuration="8.861374463s" podCreationTimestamp="2026-02-23 10:10:06 +0000 UTC" firstStartedPulling="2026-02-23 10:10:07.458580221 +0000 UTC m=+12270.462364794" lastFinishedPulling="2026-02-23 10:10:13.783265259 +0000 UTC m=+12276.787049832" observedRunningTime="2026-02-23 10:10:14.851541125 +0000 UTC m=+12277.855325708" watchObservedRunningTime="2026-02-23 10:10:14.861374463 +0000 UTC m=+12277.865159036"
Feb 23 10:10:18 crc kubenswrapper[5118]: E0223 10:10:18.957481 5118 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:45348->38.102.83.46:40611: write tcp 38.102.83.46:45348->38.102.83.46:40611: write: broken pipe
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.086658 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-nwg8x"]
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.088659 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.215983 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5w4\" (UniqueName: \"kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.216474 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.318545 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5w4\" (UniqueName: \"kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.318730 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.318844 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.337626 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5w4\" (UniqueName: \"kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4\") pod \"crc-debug-nwg8x\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.408109 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x"
Feb 23 10:10:20 crc kubenswrapper[5118]: I0223 10:10:20.900067 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" event={"ID":"d1f1a230-130d-4cf9-a480-4d57f666a95c","Type":"ContainerStarted","Data":"fdda6996057e97ad55518fcaf2e2ac6e6be8f85b1cb98167227e879a4fcf1b53"}
Feb 23 10:10:30 crc kubenswrapper[5118]: I0223 10:10:30.022237 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" event={"ID":"d1f1a230-130d-4cf9-a480-4d57f666a95c","Type":"ContainerStarted","Data":"c05fda2b3296a81e5b6f8afd6fec560a0597e4f949ba354e9d577e6774bd35e0"}
Feb 23 10:10:30 crc kubenswrapper[5118]: I0223 10:10:30.047675 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" podStartSLOduration=0.860323755 podStartE2EDuration="10.047650985s" podCreationTimestamp="2026-02-23 10:10:20 +0000 UTC" firstStartedPulling="2026-02-23 10:10:20.44224257 +0000 UTC m=+12283.446027133" lastFinishedPulling="2026-02-23 10:10:29.62956979 +0000 UTC m=+12292.633354363" observedRunningTime="2026-02-23 10:10:30.040977093 +0000 UTC m=+12293.044761666" watchObservedRunningTime="2026-02-23 10:10:30.047650985 +0000 UTC m=+12293.051435558"
Feb 23 10:10:32 crc kubenswrapper[5118]: I0223 10:10:32.974839 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 10:10:32 crc kubenswrapper[5118]: I0223 10:10:32.975568 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 10:10:45 crc kubenswrapper[5118]: I0223 10:10:45.956007 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-286ds"]
Feb 23 10:10:45 crc kubenswrapper[5118]: I0223 10:10:45.964146 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:45 crc kubenswrapper[5118]: I0223 10:10:45.980588 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-286ds"]
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.112748 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.112817 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.112853 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvb7v\" (UniqueName: \"kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.215092 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.215161 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.215189 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvb7v\" (UniqueName: \"kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.215859 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.215998 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.237188 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvb7v\" (UniqueName: \"kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v\") pod \"certified-operators-286ds\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:46 crc kubenswrapper[5118]: I0223 10:10:46.386368 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-286ds"
Feb 23 10:10:47 crc kubenswrapper[5118]: I0223 10:10:47.000768 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-286ds"]
Feb 23 10:10:47 crc kubenswrapper[5118]: W0223 10:10:47.008447 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2d6ae7_09b6_4cab_9a26_12f0381f0680.slice/crio-989771aa219ce8ce34ed2cc52e901d707f1e933dd1cf9586c9a0b8e1c3e61cd4 WatchSource:0}: Error finding container 989771aa219ce8ce34ed2cc52e901d707f1e933dd1cf9586c9a0b8e1c3e61cd4: Status 404 returned error can't find the container with id 989771aa219ce8ce34ed2cc52e901d707f1e933dd1cf9586c9a0b8e1c3e61cd4
Feb 23 10:10:47 crc kubenswrapper[5118]: I0223 10:10:47.201708 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerStarted","Data":"989771aa219ce8ce34ed2cc52e901d707f1e933dd1cf9586c9a0b8e1c3e61cd4"}
Feb 23 10:10:48 crc kubenswrapper[5118]: I0223 10:10:48.216236 5118 generic.go:334] "Generic (PLEG): container finished" podID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerID="c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd" exitCode=0
Feb 23 10:10:48 crc kubenswrapper[5118]: I0223 10:10:48.216833 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerDied","Data":"c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd"}
Feb 23 10:10:50 crc kubenswrapper[5118]: I0223 10:10:50.253559 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerStarted","Data":"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683"}
Feb 23 10:10:51 crc kubenswrapper[5118]: I0223 10:10:51.288753 5118 generic.go:334] "Generic (PLEG): container finished" podID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerID="0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683" exitCode=0
Feb 23 10:10:51 crc kubenswrapper[5118]: I0223 10:10:51.289162 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerDied","Data":"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683"}
Feb 23 10:10:52 crc kubenswrapper[5118]: I0223 10:10:52.302538 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerStarted","Data":"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078"}
Feb 23 10:10:52 crc kubenswrapper[5118]: I0223 10:10:52.345730 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-286ds" podStartSLOduration=3.851019521 podStartE2EDuration="7.345707666s" podCreationTimestamp="2026-02-23 10:10:45 +0000 UTC" firstStartedPulling="2026-02-23 10:10:48.223188193 +0000 UTC m=+12311.226972766" lastFinishedPulling="2026-02-23 10:10:51.717876298 +0000 UTC m=+12314.721660911" observedRunningTime="2026-02-23 10:10:52.334326591 +0000 UTC m=+12315.338111264" watchObservedRunningTime="2026-02-23 10:10:52.345707666 +0000 UTC m=+12315.349492249"
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.315527 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbfss"]
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.319005 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbfss"
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.344767 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbfss"]
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.491677 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss"
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.492160 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss"
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.492220 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576zx\" (UniqueName: \"kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss"
Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.594190 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.594294 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.594368 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576zx\" (UniqueName: \"kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.595136 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.595465 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.631701 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-576zx\" (UniqueName: 
\"kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx\") pod \"community-operators-mbfss\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:54 crc kubenswrapper[5118]: I0223 10:10:54.688579 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:10:55 crc kubenswrapper[5118]: I0223 10:10:55.248518 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbfss"] Feb 23 10:10:55 crc kubenswrapper[5118]: W0223 10:10:55.249379 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52a10af_d2df_47f5_87b4_9336e2ab87d6.slice/crio-409cf0abcdbb98b4f571abda6164d992d8875882e9babfa1eed5604cd6a69513 WatchSource:0}: Error finding container 409cf0abcdbb98b4f571abda6164d992d8875882e9babfa1eed5604cd6a69513: Status 404 returned error can't find the container with id 409cf0abcdbb98b4f571abda6164d992d8875882e9babfa1eed5604cd6a69513 Feb 23 10:10:55 crc kubenswrapper[5118]: I0223 10:10:55.338211 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerStarted","Data":"409cf0abcdbb98b4f571abda6164d992d8875882e9babfa1eed5604cd6a69513"} Feb 23 10:10:56 crc kubenswrapper[5118]: I0223 10:10:56.350875 5118 generic.go:334] "Generic (PLEG): container finished" podID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerID="66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8" exitCode=0 Feb 23 10:10:56 crc kubenswrapper[5118]: I0223 10:10:56.350973 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" 
event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerDied","Data":"66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8"} Feb 23 10:10:56 crc kubenswrapper[5118]: I0223 10:10:56.386724 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:10:56 crc kubenswrapper[5118]: I0223 10:10:56.387186 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:10:56 crc kubenswrapper[5118]: I0223 10:10:56.445659 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:10:57 crc kubenswrapper[5118]: I0223 10:10:57.364310 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerStarted","Data":"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001"} Feb 23 10:10:57 crc kubenswrapper[5118]: I0223 10:10:57.412880 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:10:58 crc kubenswrapper[5118]: I0223 10:10:58.380178 5118 generic.go:334] "Generic (PLEG): container finished" podID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerID="24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001" exitCode=0 Feb 23 10:10:58 crc kubenswrapper[5118]: I0223 10:10:58.381667 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerDied","Data":"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001"} Feb 23 10:10:58 crc kubenswrapper[5118]: I0223 10:10:58.705586 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-286ds"] Feb 23 10:10:59 
crc kubenswrapper[5118]: I0223 10:10:59.393060 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerStarted","Data":"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b"} Feb 23 10:10:59 crc kubenswrapper[5118]: I0223 10:10:59.393149 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-286ds" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="registry-server" containerID="cri-o://4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078" gracePeriod=2 Feb 23 10:10:59 crc kubenswrapper[5118]: I0223 10:10:59.463796 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbfss" podStartSLOduration=2.980273273 podStartE2EDuration="5.463771116s" podCreationTimestamp="2026-02-23 10:10:54 +0000 UTC" firstStartedPulling="2026-02-23 10:10:56.354207007 +0000 UTC m=+12319.357991620" lastFinishedPulling="2026-02-23 10:10:58.83770489 +0000 UTC m=+12321.841489463" observedRunningTime="2026-02-23 10:10:59.418191542 +0000 UTC m=+12322.421976125" watchObservedRunningTime="2026-02-23 10:10:59.463771116 +0000 UTC m=+12322.467555699" Feb 23 10:10:59 crc kubenswrapper[5118]: I0223 10:10:59.928191 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.031765 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvb7v\" (UniqueName: \"kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v\") pod \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.032171 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities\") pod \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.032223 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content\") pod \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\" (UID: \"cc2d6ae7-09b6-4cab-9a26-12f0381f0680\") " Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.032806 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities" (OuterVolumeSpecName: "utilities") pod "cc2d6ae7-09b6-4cab-9a26-12f0381f0680" (UID: "cc2d6ae7-09b6-4cab-9a26-12f0381f0680"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.033143 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.039254 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v" (OuterVolumeSpecName: "kube-api-access-vvb7v") pod "cc2d6ae7-09b6-4cab-9a26-12f0381f0680" (UID: "cc2d6ae7-09b6-4cab-9a26-12f0381f0680"). InnerVolumeSpecName "kube-api-access-vvb7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.088083 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc2d6ae7-09b6-4cab-9a26-12f0381f0680" (UID: "cc2d6ae7-09b6-4cab-9a26-12f0381f0680"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.135575 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvb7v\" (UniqueName: \"kubernetes.io/projected/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-kube-api-access-vvb7v\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.135629 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc2d6ae7-09b6-4cab-9a26-12f0381f0680-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.403758 5118 generic.go:334] "Generic (PLEG): container finished" podID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerID="4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078" exitCode=0 Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.403798 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-286ds" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.403833 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerDied","Data":"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078"} Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.404050 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-286ds" event={"ID":"cc2d6ae7-09b6-4cab-9a26-12f0381f0680","Type":"ContainerDied","Data":"989771aa219ce8ce34ed2cc52e901d707f1e933dd1cf9586c9a0b8e1c3e61cd4"} Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.404067 5118 scope.go:117] "RemoveContainer" containerID="4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.424215 5118 scope.go:117] "RemoveContainer" 
containerID="0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.439132 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-286ds"] Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.452192 5118 scope.go:117] "RemoveContainer" containerID="c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.459427 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-286ds"] Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.500986 5118 scope.go:117] "RemoveContainer" containerID="4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078" Feb 23 10:11:00 crc kubenswrapper[5118]: E0223 10:11:00.504078 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078\": container with ID starting with 4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078 not found: ID does not exist" containerID="4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.504135 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078"} err="failed to get container status \"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078\": rpc error: code = NotFound desc = could not find container \"4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078\": container with ID starting with 4749a6311b64469669d6a394cfc8d909593f80edf315e3b8db264cd96a1b7078 not found: ID does not exist" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.504157 5118 scope.go:117] "RemoveContainer" 
containerID="0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683" Feb 23 10:11:00 crc kubenswrapper[5118]: E0223 10:11:00.504544 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683\": container with ID starting with 0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683 not found: ID does not exist" containerID="0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.504574 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683"} err="failed to get container status \"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683\": rpc error: code = NotFound desc = could not find container \"0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683\": container with ID starting with 0f769c68ae9e86c20358c8bbff999c4a382612ce11e1ca8dbaf70a9349944683 not found: ID does not exist" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.504595 5118 scope.go:117] "RemoveContainer" containerID="c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd" Feb 23 10:11:00 crc kubenswrapper[5118]: E0223 10:11:00.504886 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd\": container with ID starting with c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd not found: ID does not exist" containerID="c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd" Feb 23 10:11:00 crc kubenswrapper[5118]: I0223 10:11:00.504920 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd"} err="failed to get container status \"c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd\": rpc error: code = NotFound desc = could not find container \"c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd\": container with ID starting with c26f586ae21555522e4240f05d8797614ba096da1c7fc6dfafa02af8460bfbfd not found: ID does not exist" Feb 23 10:11:01 crc kubenswrapper[5118]: I0223 10:11:01.719923 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" path="/var/lib/kubelet/pods/cc2d6ae7-09b6-4cab-9a26-12f0381f0680/volumes" Feb 23 10:11:02 crc kubenswrapper[5118]: I0223 10:11:02.975613 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:11:02 crc kubenswrapper[5118]: I0223 10:11:02.975993 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:11:02 crc kubenswrapper[5118]: I0223 10:11:02.976183 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 10:11:02 crc kubenswrapper[5118]: I0223 10:11:02.978347 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c"} 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 10:11:02 crc kubenswrapper[5118]: I0223 10:11:02.978452 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c" gracePeriod=600 Feb 23 10:11:03 crc kubenswrapper[5118]: I0223 10:11:03.441586 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c" exitCode=0 Feb 23 10:11:03 crc kubenswrapper[5118]: I0223 10:11:03.441797 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c"} Feb 23 10:11:03 crc kubenswrapper[5118]: I0223 10:11:03.441923 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"} Feb 23 10:11:03 crc kubenswrapper[5118]: I0223 10:11:03.441948 5118 scope.go:117] "RemoveContainer" containerID="95cf4375b5ec346736fd8c50bddbd356306c08ac60b601217b2335fee4e0fb03" Feb 23 10:11:04 crc kubenswrapper[5118]: I0223 10:11:04.689657 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:04 crc kubenswrapper[5118]: I0223 10:11:04.690485 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:04 crc kubenswrapper[5118]: I0223 10:11:04.771212 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:05 crc kubenswrapper[5118]: I0223 10:11:05.533828 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:05 crc kubenswrapper[5118]: I0223 10:11:05.593999 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbfss"] Feb 23 10:11:07 crc kubenswrapper[5118]: I0223 10:11:07.506840 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbfss" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="registry-server" containerID="cri-o://baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b" gracePeriod=2 Feb 23 10:11:07 crc kubenswrapper[5118]: I0223 10:11:07.958552 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.160083 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576zx\" (UniqueName: \"kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx\") pod \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.160270 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content\") pod \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.160366 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities\") pod \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\" (UID: \"d52a10af-d2df-47f5-87b4-9336e2ab87d6\") " Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.161327 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities" (OuterVolumeSpecName: "utilities") pod "d52a10af-d2df-47f5-87b4-9336e2ab87d6" (UID: "d52a10af-d2df-47f5-87b4-9336e2ab87d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.195279 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx" (OuterVolumeSpecName: "kube-api-access-576zx") pod "d52a10af-d2df-47f5-87b4-9336e2ab87d6" (UID: "d52a10af-d2df-47f5-87b4-9336e2ab87d6"). InnerVolumeSpecName "kube-api-access-576zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.209600 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d52a10af-d2df-47f5-87b4-9336e2ab87d6" (UID: "d52a10af-d2df-47f5-87b4-9336e2ab87d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.262938 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.262971 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d52a10af-d2df-47f5-87b4-9336e2ab87d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.262983 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576zx\" (UniqueName: \"kubernetes.io/projected/d52a10af-d2df-47f5-87b4-9336e2ab87d6-kube-api-access-576zx\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.524295 5118 generic.go:334] "Generic (PLEG): container finished" podID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerID="baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b" exitCode=0 Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.524371 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerDied","Data":"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b"} Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.524489 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mbfss" event={"ID":"d52a10af-d2df-47f5-87b4-9336e2ab87d6","Type":"ContainerDied","Data":"409cf0abcdbb98b4f571abda6164d992d8875882e9babfa1eed5604cd6a69513"} Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.524528 5118 scope.go:117] "RemoveContainer" containerID="baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.524681 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbfss" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.568370 5118 scope.go:117] "RemoveContainer" containerID="24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.577421 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbfss"] Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.589376 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbfss"] Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.599361 5118 scope.go:117] "RemoveContainer" containerID="66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.669240 5118 scope.go:117] "RemoveContainer" containerID="baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b" Feb 23 10:11:08 crc kubenswrapper[5118]: E0223 10:11:08.669850 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b\": container with ID starting with baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b not found: ID does not exist" containerID="baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 
10:11:08.669880 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b"} err="failed to get container status \"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b\": rpc error: code = NotFound desc = could not find container \"baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b\": container with ID starting with baf7b01ea96a563451f00bd1456ee2b5742182017606d6fe16ef9a76c3baaa4b not found: ID does not exist" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.669901 5118 scope.go:117] "RemoveContainer" containerID="24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001" Feb 23 10:11:08 crc kubenswrapper[5118]: E0223 10:11:08.670696 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001\": container with ID starting with 24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001 not found: ID does not exist" containerID="24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.670746 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001"} err="failed to get container status \"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001\": rpc error: code = NotFound desc = could not find container \"24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001\": container with ID starting with 24b68e8f91e80812b6b2521490ede02f58cbc6d368b9b3a43d2d4686ab492001 not found: ID does not exist" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.670778 5118 scope.go:117] "RemoveContainer" containerID="66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8" Feb 23 10:11:08 crc 
kubenswrapper[5118]: E0223 10:11:08.671035 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8\": container with ID starting with 66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8 not found: ID does not exist" containerID="66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8" Feb 23 10:11:08 crc kubenswrapper[5118]: I0223 10:11:08.671053 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8"} err="failed to get container status \"66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8\": rpc error: code = NotFound desc = could not find container \"66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8\": container with ID starting with 66d1095c7be2f0b27cd5b0d8db11d3e65da7b4029751b31d0eaa1da3fbd3c5c8 not found: ID does not exist" Feb 23 10:11:09 crc kubenswrapper[5118]: I0223 10:11:09.708762 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" path="/var/lib/kubelet/pods/d52a10af-d2df-47f5-87b4-9336e2ab87d6/volumes" Feb 23 10:11:17 crc kubenswrapper[5118]: I0223 10:11:17.658053 5118 generic.go:334] "Generic (PLEG): container finished" podID="d1f1a230-130d-4cf9-a480-4d57f666a95c" containerID="c05fda2b3296a81e5b6f8afd6fec560a0597e4f949ba354e9d577e6774bd35e0" exitCode=0 Feb 23 10:11:17 crc kubenswrapper[5118]: I0223 10:11:17.658177 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" event={"ID":"d1f1a230-130d-4cf9-a480-4d57f666a95c","Type":"ContainerDied","Data":"c05fda2b3296a81e5b6f8afd6fec560a0597e4f949ba354e9d577e6774bd35e0"} Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.806893 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.871770 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-nwg8x"] Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.881642 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-nwg8x"] Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.928175 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host\") pod \"d1f1a230-130d-4cf9-a480-4d57f666a95c\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.928281 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host" (OuterVolumeSpecName: "host") pod "d1f1a230-130d-4cf9-a480-4d57f666a95c" (UID: "d1f1a230-130d-4cf9-a480-4d57f666a95c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.928434 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj5w4\" (UniqueName: \"kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4\") pod \"d1f1a230-130d-4cf9-a480-4d57f666a95c\" (UID: \"d1f1a230-130d-4cf9-a480-4d57f666a95c\") " Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.929262 5118 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1f1a230-130d-4cf9-a480-4d57f666a95c-host\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:18 crc kubenswrapper[5118]: I0223 10:11:18.933596 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4" (OuterVolumeSpecName: "kube-api-access-nj5w4") pod "d1f1a230-130d-4cf9-a480-4d57f666a95c" (UID: "d1f1a230-130d-4cf9-a480-4d57f666a95c"). InnerVolumeSpecName "kube-api-access-nj5w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:11:19 crc kubenswrapper[5118]: I0223 10:11:19.031304 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj5w4\" (UniqueName: \"kubernetes.io/projected/d1f1a230-130d-4cf9-a480-4d57f666a95c-kube-api-access-nj5w4\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:19 crc kubenswrapper[5118]: I0223 10:11:19.694218 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdda6996057e97ad55518fcaf2e2ac6e6be8f85b1cb98167227e879a4fcf1b53" Feb 23 10:11:19 crc kubenswrapper[5118]: I0223 10:11:19.694503 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-nwg8x" Feb 23 10:11:19 crc kubenswrapper[5118]: I0223 10:11:19.713509 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1f1a230-130d-4cf9-a480-4d57f666a95c" path="/var/lib/kubelet/pods/d1f1a230-130d-4cf9-a480-4d57f666a95c/volumes" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.016797 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-rgnpp"] Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017311 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017325 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017349 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="extract-utilities" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017358 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="extract-utilities" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017390 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="extract-content" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017398 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="extract-content" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017410 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="extract-content" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017417 5118 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="extract-content" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017432 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="extract-utilities" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017440 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="extract-utilities" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017460 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017467 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: E0223 10:11:20.017483 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1f1a230-130d-4cf9-a480-4d57f666a95c" containerName="container-00" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017491 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1f1a230-130d-4cf9-a480-4d57f666a95c" containerName="container-00" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017723 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc2d6ae7-09b6-4cab-9a26-12f0381f0680" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017742 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52a10af-d2df-47f5-87b4-9336e2ab87d6" containerName="registry-server" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.017775 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1f1a230-130d-4cf9-a480-4d57f666a95c" containerName="container-00" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.018678 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.154138 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.154212 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djq6q\" (UniqueName: \"kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.256940 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.256990 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djq6q\" (UniqueName: \"kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.257335 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc 
kubenswrapper[5118]: I0223 10:11:20.278971 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djq6q\" (UniqueName: \"kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q\") pod \"crc-debug-rgnpp\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.336860 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:20 crc kubenswrapper[5118]: W0223 10:11:20.364571 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9508534e_9ba4_465f_a1ed_c6d024fb0a20.slice/crio-8637282528b46903c6e5b71f4db1d7e36cf7bfc1122a50330f71ba728ea00375 WatchSource:0}: Error finding container 8637282528b46903c6e5b71f4db1d7e36cf7bfc1122a50330f71ba728ea00375: Status 404 returned error can't find the container with id 8637282528b46903c6e5b71f4db1d7e36cf7bfc1122a50330f71ba728ea00375 Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.713059 5118 generic.go:334] "Generic (PLEG): container finished" podID="9508534e-9ba4-465f-a1ed-c6d024fb0a20" containerID="0a2ba2383abc699b31d944854d6f11b8e49eb16aff5b2494a0b5ba197209c681" exitCode=0 Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.713175 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" event={"ID":"9508534e-9ba4-465f-a1ed-c6d024fb0a20","Type":"ContainerDied","Data":"0a2ba2383abc699b31d944854d6f11b8e49eb16aff5b2494a0b5ba197209c681"} Feb 23 10:11:20 crc kubenswrapper[5118]: I0223 10:11:20.713434 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" event={"ID":"9508534e-9ba4-465f-a1ed-c6d024fb0a20","Type":"ContainerStarted","Data":"8637282528b46903c6e5b71f4db1d7e36cf7bfc1122a50330f71ba728ea00375"} Feb 23 
10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.603575 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-rgnpp"] Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.613349 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-rgnpp"] Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.832171 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.991806 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host\") pod \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.991907 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djq6q\" (UniqueName: \"kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q\") pod \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\" (UID: \"9508534e-9ba4-465f-a1ed-c6d024fb0a20\") " Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.992012 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host" (OuterVolumeSpecName: "host") pod "9508534e-9ba4-465f-a1ed-c6d024fb0a20" (UID: "9508534e-9ba4-465f-a1ed-c6d024fb0a20"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 10:11:21 crc kubenswrapper[5118]: I0223 10:11:21.993026 5118 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9508534e-9ba4-465f-a1ed-c6d024fb0a20-host\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.001446 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q" (OuterVolumeSpecName: "kube-api-access-djq6q") pod "9508534e-9ba4-465f-a1ed-c6d024fb0a20" (UID: "9508534e-9ba4-465f-a1ed-c6d024fb0a20"). InnerVolumeSpecName "kube-api-access-djq6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.095419 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djq6q\" (UniqueName: \"kubernetes.io/projected/9508534e-9ba4-465f-a1ed-c6d024fb0a20-kube-api-access-djq6q\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.742599 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-rgnpp" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.743930 5118 scope.go:117] "RemoveContainer" containerID="0a2ba2383abc699b31d944854d6f11b8e49eb16aff5b2494a0b5ba197209c681" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.827925 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-z4kg7"] Feb 23 10:11:22 crc kubenswrapper[5118]: E0223 10:11:22.828548 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9508534e-9ba4-465f-a1ed-c6d024fb0a20" containerName="container-00" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.828657 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="9508534e-9ba4-465f-a1ed-c6d024fb0a20" containerName="container-00" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.828907 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="9508534e-9ba4-465f-a1ed-c6d024fb0a20" containerName="container-00" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.829626 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.910258 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:22 crc kubenswrapper[5118]: I0223 10:11:22.910780 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vcg6\" (UniqueName: \"kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.012984 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vcg6\" (UniqueName: \"kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.013169 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.013311 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc 
kubenswrapper[5118]: I0223 10:11:23.050702 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vcg6\" (UniqueName: \"kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6\") pod \"crc-debug-z4kg7\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.152240 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:23 crc kubenswrapper[5118]: W0223 10:11:23.185601 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117fd8d6_fa9e_4481_944b_c3d08560fdf4.slice/crio-90cf6375c888fc65c165110b8345301f6ee96461a34fb0174058471003ae3d09 WatchSource:0}: Error finding container 90cf6375c888fc65c165110b8345301f6ee96461a34fb0174058471003ae3d09: Status 404 returned error can't find the container with id 90cf6375c888fc65c165110b8345301f6ee96461a34fb0174058471003ae3d09 Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.716534 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9508534e-9ba4-465f-a1ed-c6d024fb0a20" path="/var/lib/kubelet/pods/9508534e-9ba4-465f-a1ed-c6d024fb0a20/volumes" Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.759769 5118 generic.go:334] "Generic (PLEG): container finished" podID="117fd8d6-fa9e-4481-944b-c3d08560fdf4" containerID="2befa269d4309fe905d79f8adda3ec60934dca742f3092b21ea4da7c94d6a403" exitCode=0 Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.759825 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" event={"ID":"117fd8d6-fa9e-4481-944b-c3d08560fdf4","Type":"ContainerDied","Data":"2befa269d4309fe905d79f8adda3ec60934dca742f3092b21ea4da7c94d6a403"} Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.759858 5118 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" event={"ID":"117fd8d6-fa9e-4481-944b-c3d08560fdf4","Type":"ContainerStarted","Data":"90cf6375c888fc65c165110b8345301f6ee96461a34fb0174058471003ae3d09"} Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.815701 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-z4kg7"] Feb 23 10:11:23 crc kubenswrapper[5118]: I0223 10:11:23.828121 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lrbmh/crc-debug-z4kg7"] Feb 23 10:11:24 crc kubenswrapper[5118]: I0223 10:11:24.898415 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.056894 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vcg6\" (UniqueName: \"kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6\") pod \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.057269 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host\") pod \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\" (UID: \"117fd8d6-fa9e-4481-944b-c3d08560fdf4\") " Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.057766 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host" (OuterVolumeSpecName: "host") pod "117fd8d6-fa9e-4481-944b-c3d08560fdf4" (UID: "117fd8d6-fa9e-4481-944b-c3d08560fdf4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.058218 5118 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/117fd8d6-fa9e-4481-944b-c3d08560fdf4-host\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.065398 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6" (OuterVolumeSpecName: "kube-api-access-5vcg6") pod "117fd8d6-fa9e-4481-944b-c3d08560fdf4" (UID: "117fd8d6-fa9e-4481-944b-c3d08560fdf4"). InnerVolumeSpecName "kube-api-access-5vcg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.161622 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vcg6\" (UniqueName: \"kubernetes.io/projected/117fd8d6-fa9e-4481-944b-c3d08560fdf4-kube-api-access-5vcg6\") on node \"crc\" DevicePath \"\"" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.721756 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117fd8d6-fa9e-4481-944b-c3d08560fdf4" path="/var/lib/kubelet/pods/117fd8d6-fa9e-4481-944b-c3d08560fdf4/volumes" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.787997 5118 scope.go:117] "RemoveContainer" containerID="2befa269d4309fe905d79f8adda3ec60934dca742f3092b21ea4da7c94d6a403" Feb 23 10:11:25 crc kubenswrapper[5118]: I0223 10:11:25.788063 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/crc-debug-z4kg7" Feb 23 10:13:32 crc kubenswrapper[5118]: I0223 10:13:32.975428 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:13:32 crc kubenswrapper[5118]: I0223 10:13:32.976034 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.769763 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:36 crc kubenswrapper[5118]: E0223 10:13:36.770688 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117fd8d6-fa9e-4481-944b-c3d08560fdf4" containerName="container-00" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.770702 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="117fd8d6-fa9e-4481-944b-c3d08560fdf4" containerName="container-00" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.770937 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="117fd8d6-fa9e-4481-944b-c3d08560fdf4" containerName="container-00" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.772523 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.782944 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.876331 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.876398 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.876451 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvq9l\" (UniqueName: \"kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.963320 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.965555 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.978952 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.979000 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.979041 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvq9l\" (UniqueName: \"kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.979466 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.979581 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " 
pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:36 crc kubenswrapper[5118]: I0223 10:13:36.982636 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.006606 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvq9l\" (UniqueName: \"kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l\") pod \"redhat-marketplace-9tvqr\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.080524 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxjf\" (UniqueName: \"kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.080843 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.080915 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.093838 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.183121 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.183291 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxjf\" (UniqueName: \"kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.183325 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.183798 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.184049 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " 
pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.209170 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxjf\" (UniqueName: \"kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf\") pod \"redhat-operators-g9vkh\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.388273 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.594787 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:37 crc kubenswrapper[5118]: I0223 10:13:37.863329 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:13:37 crc kubenswrapper[5118]: W0223 10:13:37.866016 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24a7994_0a1e_4538_9148_d70634e23dab.slice/crio-39324be2e652b6834c5a2dbf4ea6624d02f453c345c928827ca505a2d95e097f WatchSource:0}: Error finding container 39324be2e652b6834c5a2dbf4ea6624d02f453c345c928827ca505a2d95e097f: Status 404 returned error can't find the container with id 39324be2e652b6834c5a2dbf4ea6624d02f453c345c928827ca505a2d95e097f Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.201688 5118 generic.go:334] "Generic (PLEG): container finished" podID="700cfb03-9925-4678-8c30-bd08af10c64c" containerID="25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954" exitCode=0 Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.201800 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" 
event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerDied","Data":"25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954"} Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.202128 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerStarted","Data":"d06fcf41753b069f9672ff3c20f06359a8b614c47ed33ced9ec4f5b296ae9e81"} Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.204766 5118 generic.go:334] "Generic (PLEG): container finished" podID="e24a7994-0a1e-4538-9148-d70634e23dab" containerID="74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4" exitCode=0 Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.204802 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerDied","Data":"74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4"} Feb 23 10:13:38 crc kubenswrapper[5118]: I0223 10:13:38.204823 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerStarted","Data":"39324be2e652b6834c5a2dbf4ea6624d02f453c345c928827ca505a2d95e097f"} Feb 23 10:13:39 crc kubenswrapper[5118]: I0223 10:13:39.216619 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerStarted","Data":"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139"} Feb 23 10:13:39 crc kubenswrapper[5118]: I0223 10:13:39.220712 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" 
event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerStarted","Data":"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9"} Feb 23 10:13:41 crc kubenswrapper[5118]: I0223 10:13:41.246270 5118 generic.go:334] "Generic (PLEG): container finished" podID="700cfb03-9925-4678-8c30-bd08af10c64c" containerID="c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139" exitCode=0 Feb 23 10:13:41 crc kubenswrapper[5118]: I0223 10:13:41.246344 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerDied","Data":"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139"} Feb 23 10:13:43 crc kubenswrapper[5118]: I0223 10:13:43.269882 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerStarted","Data":"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737"} Feb 23 10:13:43 crc kubenswrapper[5118]: I0223 10:13:43.272207 5118 generic.go:334] "Generic (PLEG): container finished" podID="e24a7994-0a1e-4538-9148-d70634e23dab" containerID="a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9" exitCode=0 Feb 23 10:13:43 crc kubenswrapper[5118]: I0223 10:13:43.272248 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerDied","Data":"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9"} Feb 23 10:13:43 crc kubenswrapper[5118]: I0223 10:13:43.296495 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9tvqr" podStartSLOduration=3.677325272 podStartE2EDuration="7.296470544s" podCreationTimestamp="2026-02-23 10:13:36 +0000 UTC" firstStartedPulling="2026-02-23 10:13:38.203530695 +0000 UTC 
m=+12481.207315268" lastFinishedPulling="2026-02-23 10:13:41.822675947 +0000 UTC m=+12484.826460540" observedRunningTime="2026-02-23 10:13:43.287906727 +0000 UTC m=+12486.291691300" watchObservedRunningTime="2026-02-23 10:13:43.296470544 +0000 UTC m=+12486.300255117" Feb 23 10:13:44 crc kubenswrapper[5118]: I0223 10:13:44.282643 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerStarted","Data":"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208"} Feb 23 10:13:44 crc kubenswrapper[5118]: I0223 10:13:44.300251 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9vkh" podStartSLOduration=2.830579685 podStartE2EDuration="8.300236366s" podCreationTimestamp="2026-02-23 10:13:36 +0000 UTC" firstStartedPulling="2026-02-23 10:13:38.206125548 +0000 UTC m=+12481.209910121" lastFinishedPulling="2026-02-23 10:13:43.675782229 +0000 UTC m=+12486.679566802" observedRunningTime="2026-02-23 10:13:44.297442599 +0000 UTC m=+12487.301227172" watchObservedRunningTime="2026-02-23 10:13:44.300236366 +0000 UTC m=+12487.304020939" Feb 23 10:13:47 crc kubenswrapper[5118]: I0223 10:13:47.096402 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:47 crc kubenswrapper[5118]: I0223 10:13:47.096663 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:47 crc kubenswrapper[5118]: I0223 10:13:47.206690 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:47 crc kubenswrapper[5118]: I0223 10:13:47.360591 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:47 
crc kubenswrapper[5118]: I0223 10:13:47.389405 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:47 crc kubenswrapper[5118]: I0223 10:13:47.390368 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:13:48 crc kubenswrapper[5118]: I0223 10:13:48.451645 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g9vkh" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server" probeResult="failure" output=< Feb 23 10:13:48 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 10:13:48 crc kubenswrapper[5118]: > Feb 23 10:13:49 crc kubenswrapper[5118]: I0223 10:13:49.752409 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:49 crc kubenswrapper[5118]: I0223 10:13:49.752963 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9tvqr" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="registry-server" containerID="cri-o://0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737" gracePeriod=2 Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.194848 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.277364 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities\") pod \"700cfb03-9925-4678-8c30-bd08af10c64c\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.277492 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content\") pod \"700cfb03-9925-4678-8c30-bd08af10c64c\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.277557 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvq9l\" (UniqueName: \"kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l\") pod \"700cfb03-9925-4678-8c30-bd08af10c64c\" (UID: \"700cfb03-9925-4678-8c30-bd08af10c64c\") " Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.279054 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities" (OuterVolumeSpecName: "utilities") pod "700cfb03-9925-4678-8c30-bd08af10c64c" (UID: "700cfb03-9925-4678-8c30-bd08af10c64c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.285347 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l" (OuterVolumeSpecName: "kube-api-access-vvq9l") pod "700cfb03-9925-4678-8c30-bd08af10c64c" (UID: "700cfb03-9925-4678-8c30-bd08af10c64c"). InnerVolumeSpecName "kube-api-access-vvq9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.303291 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "700cfb03-9925-4678-8c30-bd08af10c64c" (UID: "700cfb03-9925-4678-8c30-bd08af10c64c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.341495 5118 generic.go:334] "Generic (PLEG): container finished" podID="700cfb03-9925-4678-8c30-bd08af10c64c" containerID="0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737" exitCode=0 Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.341546 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerDied","Data":"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737"} Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.341596 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9tvqr" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.341623 5118 scope.go:117] "RemoveContainer" containerID="0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.341606 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9tvqr" event={"ID":"700cfb03-9925-4678-8c30-bd08af10c64c","Type":"ContainerDied","Data":"d06fcf41753b069f9672ff3c20f06359a8b614c47ed33ced9ec4f5b296ae9e81"} Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.363499 5118 scope.go:117] "RemoveContainer" containerID="c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.379693 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.379724 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/700cfb03-9925-4678-8c30-bd08af10c64c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.379734 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvq9l\" (UniqueName: \"kubernetes.io/projected/700cfb03-9925-4678-8c30-bd08af10c64c-kube-api-access-vvq9l\") on node \"crc\" DevicePath \"\"" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.393213 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.393630 5118 scope.go:117] "RemoveContainer" containerID="25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.404351 5118 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9tvqr"] Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.440214 5118 scope.go:117] "RemoveContainer" containerID="0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737" Feb 23 10:13:50 crc kubenswrapper[5118]: E0223 10:13:50.440727 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737\": container with ID starting with 0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737 not found: ID does not exist" containerID="0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.440762 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737"} err="failed to get container status \"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737\": rpc error: code = NotFound desc = could not find container \"0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737\": container with ID starting with 0ef5239e35949eaad49c23b884e9ebbb61945e86bbd1140b4017bc265884c737 not found: ID does not exist" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.440788 5118 scope.go:117] "RemoveContainer" containerID="c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139" Feb 23 10:13:50 crc kubenswrapper[5118]: E0223 10:13:50.441234 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139\": container with ID starting with c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139 not found: ID does not exist" containerID="c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139" Feb 23 
10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.441266 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139"} err="failed to get container status \"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139\": rpc error: code = NotFound desc = could not find container \"c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139\": container with ID starting with c4d40512b44f403b8e0db4e4672d8181dbc65c7d2afb04dd1cb904443fd43139 not found: ID does not exist" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.441283 5118 scope.go:117] "RemoveContainer" containerID="25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954" Feb 23 10:13:50 crc kubenswrapper[5118]: E0223 10:13:50.441594 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954\": container with ID starting with 25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954 not found: ID does not exist" containerID="25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954" Feb 23 10:13:50 crc kubenswrapper[5118]: I0223 10:13:50.441632 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954"} err="failed to get container status \"25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954\": rpc error: code = NotFound desc = could not find container \"25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954\": container with ID starting with 25cb9bf31857526285050a83e13a939dea02fa5dcf2276bfa3a8c760fb8fe954 not found: ID does not exist" Feb 23 10:13:51 crc kubenswrapper[5118]: I0223 10:13:51.710582 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" 
path="/var/lib/kubelet/pods/700cfb03-9925-4678-8c30-bd08af10c64c/volumes" Feb 23 10:13:58 crc kubenswrapper[5118]: I0223 10:13:58.457505 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g9vkh" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server" probeResult="failure" output=< Feb 23 10:13:58 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 10:13:58 crc kubenswrapper[5118]: > Feb 23 10:14:02 crc kubenswrapper[5118]: I0223 10:14:02.975720 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:14:02 crc kubenswrapper[5118]: I0223 10:14:02.976403 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:14:07 crc kubenswrapper[5118]: I0223 10:14:07.439024 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:14:07 crc kubenswrapper[5118]: I0223 10:14:07.497523 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:14:11 crc kubenswrapper[5118]: I0223 10:14:11.544372 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:14:11 crc kubenswrapper[5118]: I0223 10:14:11.545149 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9vkh" 
podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server" containerID="cri-o://7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208" gracePeriod=2 Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.058782 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.121013 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmxjf\" (UniqueName: \"kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf\") pod \"e24a7994-0a1e-4538-9148-d70634e23dab\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.121704 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content\") pod \"e24a7994-0a1e-4538-9148-d70634e23dab\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.121735 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities\") pod \"e24a7994-0a1e-4538-9148-d70634e23dab\" (UID: \"e24a7994-0a1e-4538-9148-d70634e23dab\") " Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.124213 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities" (OuterVolumeSpecName: "utilities") pod "e24a7994-0a1e-4538-9148-d70634e23dab" (UID: "e24a7994-0a1e-4538-9148-d70634e23dab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.139398 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf" (OuterVolumeSpecName: "kube-api-access-tmxjf") pod "e24a7994-0a1e-4538-9148-d70634e23dab" (UID: "e24a7994-0a1e-4538-9148-d70634e23dab"). InnerVolumeSpecName "kube-api-access-tmxjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.224800 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmxjf\" (UniqueName: \"kubernetes.io/projected/e24a7994-0a1e-4538-9148-d70634e23dab-kube-api-access-tmxjf\") on node \"crc\" DevicePath \"\"" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.224842 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.282876 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e24a7994-0a1e-4538-9148-d70634e23dab" (UID: "e24a7994-0a1e-4538-9148-d70634e23dab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.326872 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e24a7994-0a1e-4538-9148-d70634e23dab-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.594157 5118 generic.go:334] "Generic (PLEG): container finished" podID="e24a7994-0a1e-4538-9148-d70634e23dab" containerID="7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208" exitCode=0 Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.594208 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerDied","Data":"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208"} Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.594248 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9vkh" event={"ID":"e24a7994-0a1e-4538-9148-d70634e23dab","Type":"ContainerDied","Data":"39324be2e652b6834c5a2dbf4ea6624d02f453c345c928827ca505a2d95e097f"} Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.594251 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g9vkh" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.594267 5118 scope.go:117] "RemoveContainer" containerID="7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.618669 5118 scope.go:117] "RemoveContainer" containerID="a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.650850 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.663169 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9vkh"] Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.670182 5118 scope.go:117] "RemoveContainer" containerID="74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.693408 5118 scope.go:117] "RemoveContainer" containerID="7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208" Feb 23 10:14:12 crc kubenswrapper[5118]: E0223 10:14:12.693827 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208\": container with ID starting with 7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208 not found: ID does not exist" containerID="7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.693861 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208"} err="failed to get container status \"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208\": rpc error: code = NotFound desc = could not find container 
\"7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208\": container with ID starting with 7eeebf0bf3a084f799cd911737fa3e881e0e13080c626b943fe6235151114208 not found: ID does not exist" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.693882 5118 scope.go:117] "RemoveContainer" containerID="a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9" Feb 23 10:14:12 crc kubenswrapper[5118]: E0223 10:14:12.694394 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9\": container with ID starting with a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9 not found: ID does not exist" containerID="a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.694417 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9"} err="failed to get container status \"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9\": rpc error: code = NotFound desc = could not find container \"a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9\": container with ID starting with a71407879ef049273756f5932aadf616a5da651007fbc3368e821e82791459b9 not found: ID does not exist" Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.694433 5118 scope.go:117] "RemoveContainer" containerID="74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4" Feb 23 10:14:12 crc kubenswrapper[5118]: E0223 10:14:12.694640 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4\": container with ID starting with 74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4 not found: ID does not exist" 
containerID="74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4"
Feb 23 10:14:12 crc kubenswrapper[5118]: I0223 10:14:12.694665 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4"} err="failed to get container status \"74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4\": rpc error: code = NotFound desc = could not find container \"74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4\": container with ID starting with 74699eb1a1cbce493e680992be85cdbc495e8333dc025bfdaa91bcdbbd565dc4 not found: ID does not exist"
Feb 23 10:14:13 crc kubenswrapper[5118]: I0223 10:14:13.707811 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" path="/var/lib/kubelet/pods/e24a7994-0a1e-4538-9148-d70634e23dab/volumes"
Feb 23 10:14:32 crc kubenswrapper[5118]: I0223 10:14:32.975028 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 10:14:32 crc kubenswrapper[5118]: I0223 10:14:32.975783 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 10:14:32 crc kubenswrapper[5118]: I0223 10:14:32.975848 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9"
Feb 23 10:14:32 crc kubenswrapper[5118]: I0223 10:14:32.976904 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 10:14:32 crc kubenswrapper[5118]: I0223 10:14:32.976986 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" gracePeriod=600
Feb 23 10:14:33 crc kubenswrapper[5118]: E0223 10:14:33.106436 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 10:14:33 crc kubenswrapper[5118]: I0223 10:14:33.806994 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" exitCode=0
Feb 23 10:14:33 crc kubenswrapper[5118]: I0223 10:14:33.807054 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"}
Feb 23 10:14:33 crc kubenswrapper[5118]: I0223 10:14:33.807286 5118 scope.go:117] "RemoveContainer" containerID="4dcb38446cecdf81ec0db57492be71a7df5ba224f8bd7abc00df7bfb9045ad2c"
Feb 23 10:14:33 crc kubenswrapper[5118]: I0223 10:14:33.807960 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"
Feb 23 10:14:33 crc kubenswrapper[5118]: E0223 10:14:33.808270 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.353922 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2/init-config-reloader/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.524429 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2/config-reloader/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.532656 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2/init-config-reloader/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.545859 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b8ba7872-be8c-4c8a-874f-f44c3b4cf8b2/alertmanager/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.731752 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7b334d5c-a053-474f-8395-432faf152c91/aodh-listener/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.743818 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7b334d5c-a053-474f-8395-432faf152c91/aodh-api/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.765957 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7b334d5c-a053-474f-8395-432faf152c91/aodh-evaluator/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.847614 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_7b334d5c-a053-474f-8395-432faf152c91/aodh-notifier/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.930854 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-678dd5c864-6nzbp_a5bb46fd-6ecf-49f2-8b75-082309e154ab/barbican-api/0.log"
Feb 23 10:14:47 crc kubenswrapper[5118]: I0223 10:14:47.962011 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-678dd5c864-6nzbp_a5bb46fd-6ecf-49f2-8b75-082309e154ab/barbican-api-log/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.113553 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c8f867454-q8dx9_33b45cc0-f140-4ef3-ad47-56be870583a5/barbican-keystone-listener/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.349363 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b5cb8cbf-z5x2s_afa3b53c-97b6-4eb6-b1f3-22b5feef02b5/barbican-worker/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.370708 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b5cb8cbf-z5x2s_afa3b53c-97b6-4eb6-b1f3-22b5feef02b5/barbican-worker-log/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.683533 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-tq4rm_687c6ebc-4763-49bb-8293-b1e92f4f39ca/bootstrap-openstack-openstack-cell1/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.697079 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"
Feb 23 10:14:48 crc kubenswrapper[5118]: E0223 10:14:48.697406 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.769426 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-krbms_5f95e63a-6e71-4bf2-9ba9-694e78a6e0c1/bootstrap-openstack-openstack-networker/0.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.903847 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c23b5987-be35-42db-b1d5-cfaadfbdb9e0/ceilometer-central-agent/1.log"
Feb 23 10:14:48 crc kubenswrapper[5118]: I0223 10:14:48.959863 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c8f867454-q8dx9_33b45cc0-f140-4ef3-ad47-56be870583a5/barbican-keystone-listener-log/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.014786 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c23b5987-be35-42db-b1d5-cfaadfbdb9e0/ceilometer-central-agent/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.083800 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c23b5987-be35-42db-b1d5-cfaadfbdb9e0/proxy-httpd/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.111426 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c23b5987-be35-42db-b1d5-cfaadfbdb9e0/ceilometer-notification-agent/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.127259 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c23b5987-be35-42db-b1d5-cfaadfbdb9e0/sg-core/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.261869 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-qtj48_afb0350b-ca38-4bda-82cb-78dd02b3eef1/ceph-client-openstack-openstack-cell1/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.652218 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cce4e91f-1362-4a4c-820f-d4a6ba874e33/cinder-api/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.674189 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_cce4e91f-1362-4a4c-820f-d4a6ba874e33/cinder-api-log/0.log"
Feb 23 10:14:49 crc kubenswrapper[5118]: I0223 10:14:49.907065 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6b3cd17b-fa93-44fa-9571-cebff83d65ad/probe/0.log"
Feb 23 10:14:50 crc kubenswrapper[5118]: I0223 10:14:50.125266 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1d980983-651d-4df6-a7ca-18d1191300cf/cinder-scheduler/0.log"
Feb 23 10:14:50 crc kubenswrapper[5118]: I0223 10:14:50.155215 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1d980983-651d-4df6-a7ca-18d1191300cf/cinder-scheduler/1.log"
Feb 23 10:14:50 crc kubenswrapper[5118]: I0223 10:14:50.489916 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1d980983-651d-4df6-a7ca-18d1191300cf/probe/0.log"
Feb 23 10:14:50 crc kubenswrapper[5118]: I0223 10:14:50.950903 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6b3cd17b-fa93-44fa-9571-cebff83d65ad/cinder-backup/0.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.280022 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_54fb818d-8422-4517-84b1-b9d71a9f213a/probe/0.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.413683 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_54fb818d-8422-4517-84b1-b9d71a9f213a/cinder-volume/1.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.550536 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-znjvp_c0594606-fafa-4f81-9497-2b0a637ef8d8/configure-network-openstack-openstack-cell1/0.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.634744 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-gtzln_bfa05bc3-3fc4-4d65-a467-6096899d3260/configure-network-openstack-openstack-networker/0.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.762878 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_6b3cd17b-fa93-44fa-9571-cebff83d65ad/cinder-backup/1.log"
Feb 23 10:14:51 crc kubenswrapper[5118]: I0223 10:14:51.862410 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-5wgqh_bfff4753-aa33-4692-8798-63c79f965334/configure-os-openstack-openstack-cell1/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.007653 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-vxbgf_394e2de4-4a97-4449-9c8a-f6cd4e6a1788/configure-os-openstack-openstack-networker/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.075266 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-665d6cc987-mjtg9_fcbae92c-436e-4555-bf63-c25cf526532a/init/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.285535 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-665d6cc987-mjtg9_fcbae92c-436e-4555-bf63-c25cf526532a/init/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.432357 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-fpn5p_7bba7957-c51c-4948-8946-e3f32d217698/download-cache-openstack-openstack-cell1/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.459500 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-665d6cc987-mjtg9_fcbae92c-436e-4555-bf63-c25cf526532a/dnsmasq-dns/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.618121 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_54fb818d-8422-4517-84b1-b9d71a9f213a/cinder-volume/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.666851 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-q6vdr_4a112539-48d7-414b-9d7e-b63e8afb0244/download-cache-openstack-openstack-networker/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.751912 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c3c8427e-8b9a-4896-bf16-d50804df2346/glance-log/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.766851 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c3c8427e-8b9a-4896-bf16-d50804df2346/glance-httpd/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.924046 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_51a8db46-f5fa-42f9-b28e-68791b50dc7a/glance-httpd/0.log"
Feb 23 10:14:52 crc kubenswrapper[5118]: I0223 10:14:52.931045 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_51a8db46-f5fa-42f9-b28e-68791b50dc7a/glance-log/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.151931 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-76664f888f-t8jd9_1cd05d4d-997d-43f6-8144-90706df15017/heat-api/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.285495 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5497bc9d57-4jlv5_21f60b5a-456e-4912-b16a-f0d3074f9d02/heat-engine/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.291160 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-84f9ff9486-k2nsq_8e69549d-84dc-4239-8aff-a2c5e04dd246/heat-cfnapi/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.452682 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6994f557dc-cpfnd_5090d54b-dd37-4141-b04c-0a5451a4f264/horizon/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.491455 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-jxfs9_a6383853-b182-4930-87fe-2d27f67c8cf3/install-certs-openstack-openstack-cell1/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.539180 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6994f557dc-cpfnd_5090d54b-dd37-4141-b04c-0a5451a4f264/horizon-log/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.690737 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-ntsq6_327a5d6a-824c-45f9-b361-e796699fb933/install-certs-openstack-openstack-networker/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.738599 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-rjjq4_2d868181-f742-41c8-b371-869982a76657/install-os-openstack-openstack-cell1/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.886941 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-8s5dc_350a056c-4003-4872-b099-2a475d6aabe9/install-os-openstack-openstack-networker/0.log"
Feb 23 10:14:53 crc kubenswrapper[5118]: I0223 10:14:53.978598 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530621-v4ftv_63ba3449-8278-4d38-b02c-080d45721d1c/keystone-cron/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.246127 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530681-hz74l_76e9f12c-042d-4e31-bb0b-ff59287da006/keystone-cron/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.281845 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_70317625-f015-449e-bee0-152cd305ffea/kube-state-metrics/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.509409 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-xcx95_7c62cb5a-a8c8-4ab5-9c66-dc6e3d92720c/libvirt-openstack-openstack-cell1/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.862735 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e67a6d39-e2b4-4133-b908-717e9f957170/manila-scheduler/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.937593 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e67a6d39-e2b4-4133-b908-717e9f957170/probe/0.log"
Feb 23 10:14:54 crc kubenswrapper[5118]: I0223 10:14:54.971291 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3970e01b-cedb-40fc-9594-55ec615ff971/manila-api/0.log"
Feb 23 10:14:55 crc kubenswrapper[5118]: I0223 10:14:55.054210 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6998df9889-njjfn_1688eed1-a62c-4fec-b12d-c1eb1a45d6cc/keystone-api/0.log"
Feb 23 10:14:55 crc kubenswrapper[5118]: I0223 10:14:55.147339 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3970e01b-cedb-40fc-9594-55ec615ff971/manila-api-log/0.log"
Feb 23 10:14:55 crc kubenswrapper[5118]: I0223 10:14:55.158778 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_3a077e0c-4058-463d-a95c-5566479cd3af/manila-share/0.log"
Feb 23 10:14:55 crc kubenswrapper[5118]: I0223 10:14:55.207296 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_3a077e0c-4058-463d-a95c-5566479cd3af/probe/0.log"
Feb 23 10:14:55 crc kubenswrapper[5118]: I0223 10:14:55.686057 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-674bbb4f9c-tfqpc_5af9d028-a91a-4540-926f-d0c7c50ca61b/neutron-httpd/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.269288 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-chnnl_837cd892-bf64-4ba8-9805-290c7fb37f26/neutron-metadata-openstack-openstack-cell1/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.273848 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-7q7f6_ad2694f7-370d-4478-959a-4668f0b0fefc/neutron-dhcp-openstack-openstack-cell1/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.407480 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-674bbb4f9c-tfqpc_5af9d028-a91a-4540-926f-d0c7c50ca61b/neutron-api/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.483669 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-m96tb_d0897d57-2e00-4c9a-be28-20a997e5b96f/neutron-metadata-openstack-openstack-networker/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.663628 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-64pnn_7b2f0e3e-3ab3-4f48-9823-2253d2267393/neutron-sriov-openstack-openstack-cell1/0.log"
Feb 23 10:14:56 crc kubenswrapper[5118]: I0223 10:14:56.939448 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf5e7919-f61d-4017-9be8-205d13874bd5/nova-api-api/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.200061 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bf5e7919-f61d-4017-9be8-205d13874bd5/nova-api-log/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.303969 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8f83d0f7-2228-4c9c-8990-04ccd42dafc6/nova-cell0-conductor-conductor/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.469580 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f801af4d-7c56-423f-9590-3c9a9d813356/nova-cell1-conductor-conductor/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.632875 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7e6f7cef-764b-4c09-920e-38f952ce4538/nova-cell1-novncproxy-novncproxy/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.722372 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celld6qk5_8285a362-7003-43c3-969a-88ee748d09e8/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Feb 23 10:14:57 crc kubenswrapper[5118]: I0223 10:14:57.940429 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-bvnm9_c744cb6d-c774-4eb8-82ba-d6acd7ccc1ea/nova-cell1-openstack-openstack-cell1/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.051695 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1483f138-18e5-4827-81a6-db3d0b06551e/nova-metadata-log/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.093549 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_1483f138-18e5-4827-81a6-db3d0b06551e/nova-metadata-metadata/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.335138 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3f104baf-13e1-44ac-bf73-ea400599dee0/mysql-bootstrap/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.338713 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bd75b7e3-5c92-4b37-adba-ef56c1507da6/nova-scheduler-scheduler/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.555195 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3f104baf-13e1-44ac-bf73-ea400599dee0/mysql-bootstrap/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.560893 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_3f104baf-13e1-44ac-bf73-ea400599dee0/galera/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.655740 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6148598d-7822-4b48-b805-f2544e9bc5ea/mysql-bootstrap/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.841167 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6148598d-7822-4b48-b805-f2544e9bc5ea/mysql-bootstrap/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.871961 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_6148598d-7822-4b48-b805-f2544e9bc5ea/galera/0.log"
Feb 23 10:14:58 crc kubenswrapper[5118]: I0223 10:14:58.912744 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_78cdaa37-27be-4446-b098-11f5a3b136e9/openstackclient/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.068352 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c896a524-fcc3-4636-867e-09ff5e556861/ovn-northd/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.147387 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c896a524-fcc3-4636-867e-09ff5e556861/openstack-network-exporter/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.361613 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-9zb28_15546255-9c2d-4475-93b1-1805b835245f/ovn-openstack-openstack-cell1/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.414144 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-h8lzm_4258d556-fe9b-469a-b37a-6a15eb81b4be/ovn-openstack-openstack-networker/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.536139 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b50fcd88-9e24-4304-9ba8-892b357328b5/openstack-network-exporter/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.629756 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b50fcd88-9e24-4304-9ba8-892b357328b5/ovsdbserver-nb/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.696992 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016"
Feb 23 10:14:59 crc kubenswrapper[5118]: E0223 10:14:59.697267 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.759945 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_a9c4c002-69d9-4589-8c0b-a110c31b8314/openstack-network-exporter/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.768946 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_a9c4c002-69d9-4589-8c0b-a110c31b8314/ovsdbserver-nb/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.936976 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3d0221c3-d348-4f99-9f66-2ff299664efa/ovsdbserver-nb/0.log"
Feb 23 10:14:59 crc kubenswrapper[5118]: I0223 10:14:59.959671 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3d0221c3-d348-4f99-9f66-2ff299664efa/openstack-network-exporter/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.109083 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_45e75a46-97ac-454a-aae6-988e4b4bd679/openstack-network-exporter/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.149212 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"]
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150017 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="extract-utilities"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150052 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="extract-utilities"
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150165 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="extract-utilities"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150177 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="extract-utilities"
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150225 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150236 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150247 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150256 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150283 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="extract-content"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150291 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="extract-content"
Feb 23 10:15:00 crc kubenswrapper[5118]: E0223 10:15:00.150305 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="extract-content"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150311 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="extract-content"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150603 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24a7994-0a1e-4538-9148-d70634e23dab" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.150643 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="700cfb03-9925-4678-8c30-bd08af10c64c" containerName="registry-server"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.151618 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.154224 5118 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.155252 5118 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.158559 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"]
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.197778 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_45e75a46-97ac-454a-aae6-988e4b4bd679/ovsdbserver-sb/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.258872 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_d8b50ef3-7b23-448e-86cb-6e46a16c2624/openstack-network-exporter/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.323400 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.323484 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vngh6\" (UniqueName: \"kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.327960 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.332560 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_d8b50ef3-7b23-448e-86cb-6e46a16c2624/ovsdbserver-sb/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.430175 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.430261 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vngh6\" (UniqueName: \"kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.430285 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.431981 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.437872 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.450744 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vngh6\" (UniqueName: \"kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6\") pod \"collect-profiles-29530695-t9b85\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.470656 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9b32637c-2170-4283-9a68-b8c4af56ef90/ovsdbserver-sb/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.487478 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.541115 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9b32637c-2170-4283-9a68-b8c4af56ef90/openstack-network-exporter/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.869646 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-chv96c_a5415a2b-4e23-462e-a219-3ec3c0c3e369/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log"
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.976602 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85"]
Feb 23 10:15:00 crc kubenswrapper[5118]: I0223 10:15:00.984375 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8f76cc6-xp4qh_c7b08825-a737-4e87-b491-8cba50c463dd/placement-api/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.015342 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8f76cc6-xp4qh_c7b08825-a737-4e87-b491-8cba50c463dd/placement-log/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.123127 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85" event={"ID":"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2","Type":"ContainerStarted","Data":"c9b4304f764868c5c1318625406562b339e1d50b9f0a8afb87d562d33502815a"}
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.156283 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-nzck8p_090ec22d-8571-49a4-b33a-fc211a094200/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.333670 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb3e17bb-d95e-4bff-8e39-325f7f0517f8/init-config-reloader/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.555992 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb3e17bb-d95e-4bff-8e39-325f7f0517f8/init-config-reloader/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.556940 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb3e17bb-d95e-4bff-8e39-325f7f0517f8/config-reloader/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.558345 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb3e17bb-d95e-4bff-8e39-325f7f0517f8/prometheus/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.631119 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fb3e17bb-d95e-4bff-8e39-325f7f0517f8/thanos-sidecar/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.758452 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c02f2661-073e-48e7-b6c6-527de9c32b91/setup-container/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.972344 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c02f2661-073e-48e7-b6c6-527de9c32b91/setup-container/0.log"
Feb 23 10:15:01 crc kubenswrapper[5118]: I0223 10:15:01.998926 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c02f2661-073e-48e7-b6c6-527de9c32b91/rabbitmq/0.log"
Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.028861 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25bc4bd-e5d5-49b8-9419-28f0e33f394c/setup-container/0.log"
Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.134119 5118 generic.go:334] "Generic (PLEG): container finished" podID="7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" containerID="9408382e8c320833accbb142c9d369b061904c0becf2e8b12071c962d72306ce" exitCode=0 Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.134158 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85" event={"ID":"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2","Type":"ContainerDied","Data":"9408382e8c320833accbb142c9d369b061904c0becf2e8b12071c962d72306ce"} Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.276803 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25bc4bd-e5d5-49b8-9419-28f0e33f394c/setup-container/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.312513 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25bc4bd-e5d5-49b8-9419-28f0e33f394c/rabbitmq/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.368214 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-r5qpm_5bb7c61d-20fe-4699-9ed2-1db80db282b5/reboot-os-openstack-openstack-cell1/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.560269 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-9m9lh_11eb8834-97e4-4039-aab8-41e55d069d49/reboot-os-openstack-openstack-networker/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.595992 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-2rc7r_5fbe2adc-de72-4e17-8635-911860358d61/run-os-openstack-openstack-cell1/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.831567 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-openstack-openstack-networker-fkzmh_99646e7b-9e06-4397-8677-6fc383d0683d/run-os-openstack-openstack-networker/0.log" Feb 23 10:15:02 crc kubenswrapper[5118]: I0223 10:15:02.890642 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-md2tf_10127b7d-da9f-43f0-b2c4-625ea937f23d/ssh-known-hosts-openstack/0.log" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.274952 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-56ljn_a52048ce-f33c-40af-ac4b-8d6981ff6f60/telemetry-openstack-openstack-cell1/0.log" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.410137 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b5f49473-b37e-44f5-a7cf-ad38cd064729/tempest-tests-tempest-tests-runner/0.log" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.445138 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ea4eec4e-c875-4140-a73f-51881eda1861/test-operator-logs-container/0.log" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.593135 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.699919 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-2n7rp_37085206-dc9b-47c1-a2a5-5b54b83c7e47/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.721615 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume\") pod \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.721644 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vngh6\" (UniqueName: \"kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6\") pod \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.721688 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume\") pod \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\" (UID: \"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2\") " Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.722466 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" (UID: "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.727493 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6" (OuterVolumeSpecName: "kube-api-access-vngh6") pod "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" (UID: "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2"). InnerVolumeSpecName "kube-api-access-vngh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.741143 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" (UID: "7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.825495 5118 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.825517 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vngh6\" (UniqueName: \"kubernetes.io/projected/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-kube-api-access-vngh6\") on node \"crc\" DevicePath \"\"" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.825526 5118 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 10:15:03 crc kubenswrapper[5118]: I0223 10:15:03.852284 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-pcxxf_2a03095b-b10c-4ab3-b1cb-4b31730a0585/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.017749 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-ttqz2_19007ba1-bffe-48df-b101-500178cc1930/validate-network-openstack-openstack-cell1/0.log" Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.139468 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-4ssqk_fd9a0e4f-ed13-4cdd-b8b7-3edba6274dbc/validate-network-openstack-openstack-networker/0.log" Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.153787 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85" event={"ID":"7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2","Type":"ContainerDied","Data":"c9b4304f764868c5c1318625406562b339e1d50b9f0a8afb87d562d33502815a"} Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.153834 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b4304f764868c5c1318625406562b339e1d50b9f0a8afb87d562d33502815a" Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.153895 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530695-t9b85" Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.676654 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6"] Feb 23 10:15:04 crc kubenswrapper[5118]: I0223 10:15:04.687381 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530650-qchp6"] Feb 23 10:15:05 crc kubenswrapper[5118]: I0223 10:15:05.707260 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532cd860-7918-44d0-9f42-0fad388dc16c" path="/var/lib/kubelet/pods/532cd860-7918-44d0-9f42-0fad388dc16c/volumes" Feb 23 10:15:09 crc kubenswrapper[5118]: I0223 10:15:09.140958 5118 scope.go:117] "RemoveContainer" containerID="f2c89673d5a4b994e6bb0e8931888d55a0fd2871f87313418b941c4d6304b711" Feb 23 10:15:11 crc kubenswrapper[5118]: I0223 10:15:11.697226 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:15:11 crc kubenswrapper[5118]: E0223 10:15:11.697847 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:15:15 crc kubenswrapper[5118]: I0223 10:15:15.320319 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c42fb7ad-c7fe-425a-b3b5-82cf6c03122b/memcached/0.log" Feb 23 10:15:23 crc kubenswrapper[5118]: I0223 10:15:23.698337 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:15:23 crc 
kubenswrapper[5118]: E0223 10:15:23.699389 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:15:26 crc kubenswrapper[5118]: I0223 10:15:26.816557 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/util/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.032393 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/pull/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.068592 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/pull/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.111208 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/util/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.261543 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/util/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.277620 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/pull/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.292727 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b79676bsgn_300729f0-11bf-4947-b58d-1a60bd0af741/extract/0.log" Feb 23 10:15:27 crc kubenswrapper[5118]: I0223 10:15:27.726821 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-dhlhx_323a4a15-ada9-44e1-9fb1-4533964ab4b8/manager/0.log" Feb 23 10:15:28 crc kubenswrapper[5118]: I0223 10:15:28.182763 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-98cgm_618a679a-a0c0-40a1-b3b3-79e0834aef7c/manager/0.log" Feb 23 10:15:28 crc kubenswrapper[5118]: I0223 10:15:28.246630 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-d6j2x_22db249a-2327-4cb1-a11c-72ac99925b27/manager/0.log" Feb 23 10:15:28 crc kubenswrapper[5118]: I0223 10:15:28.448019 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-vqkbz_3fe2a698-e792-41a1-92bc-589d0bab315b/manager/0.log" Feb 23 10:15:29 crc kubenswrapper[5118]: I0223 10:15:29.089866 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-7m4sj_29f5282f-4695-42d1-9ebd-6fd3d8acc24b/manager/0.log" Feb 23 10:15:29 crc kubenswrapper[5118]: I0223 10:15:29.634554 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-79wlh_f9e92ac7-cd2b-4c82-94a1-213c37487e61/manager/0.log" Feb 23 10:15:29 crc kubenswrapper[5118]: I0223 10:15:29.714024 5118 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-98sv6_d42094d4-0a02-4b3d-8bad-7d0a0bda3d47/manager/0.log" Feb 23 10:15:29 crc kubenswrapper[5118]: I0223 10:15:29.985945 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-bfhhb_e729f808-0740-4153-aa13-93bdfc02da47/manager/0.log" Feb 23 10:15:30 crc kubenswrapper[5118]: I0223 10:15:30.387814 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-kprdd_133701aa-5bf0-4431-90bd-4a67b0c71a8f/manager/0.log" Feb 23 10:15:30 crc kubenswrapper[5118]: I0223 10:15:30.615391 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-qhvx6_61637571-dc55-41bf-9765-415fa7a78100/manager/0.log" Feb 23 10:15:31 crc kubenswrapper[5118]: I0223 10:15:31.336705 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-54gzb_f392b9d5-c8ed-4976-a7df-96cbbd8faad0/manager/0.log" Feb 23 10:15:31 crc kubenswrapper[5118]: I0223 10:15:31.585744 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-9zqc5_f4b0cd84-f619-415b-9d18-bc6de1b1f40b/manager/0.log" Feb 23 10:15:32 crc kubenswrapper[5118]: I0223 10:15:32.064815 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-m96kp_e58e8816-6d4e-48bd-b00e-956269a79c2f/operator/0.log" Feb 23 10:15:32 crc kubenswrapper[5118]: I0223 10:15:32.867991 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qswx2_8a10def1-5478-48e8-acb6-3cc2e549e94a/registry-server/0.log" Feb 23 10:15:33 crc kubenswrapper[5118]: I0223 
10:15:33.395912 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-z5qcz_e56300c0-de91-4f3e-9b0b-6e5303e7561d/manager/0.log" Feb 23 10:15:33 crc kubenswrapper[5118]: I0223 10:15:33.698836 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-bz8br_f2be56ef-55b5-42c9-bf9e-7db0f6148b49/manager/0.log" Feb 23 10:15:33 crc kubenswrapper[5118]: I0223 10:15:33.801021 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-5zc5v_9885aacd-cab4-40c1-b504-067fa6b38393/manager/0.log" Feb 23 10:15:34 crc kubenswrapper[5118]: I0223 10:15:34.098838 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-dcfgx_de50ead7-a26c-4c87-a379-7c83b3ffeabd/operator/0.log" Feb 23 10:15:34 crc kubenswrapper[5118]: I0223 10:15:34.372366 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-4x7fr_a66d9cfc-3b02-4896-8dda-bcbeddb5c803/manager/0.log" Feb 23 10:15:34 crc kubenswrapper[5118]: I0223 10:15:34.699211 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:15:34 crc kubenswrapper[5118]: E0223 10:15:34.707172 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:15:34 crc kubenswrapper[5118]: I0223 10:15:34.749390 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-mgwl7_e7dabd60-e0b9-4ddf-9775-67bf8f922e54/manager/0.log" Feb 23 10:15:34 crc kubenswrapper[5118]: I0223 10:15:34.818510 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-4zd6f_07e7dae1-170c-4c44-8ebf-4547edea1c65/manager/0.log" Feb 23 10:15:35 crc kubenswrapper[5118]: I0223 10:15:35.037440 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-xkf9z_9169039b-beb7-48c1-8dbc-485f91b6a2ac/manager/0.log" Feb 23 10:15:36 crc kubenswrapper[5118]: I0223 10:15:36.832342 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-b8rhr_39af282d-8ae3-4a52-9274-cd62d12a1ef1/manager/0.log" Feb 23 10:15:37 crc kubenswrapper[5118]: I0223 10:15:37.470305 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-gvdcc_292023bd-64ab-499f-aabc-43b314775b62/manager/0.log" Feb 23 10:15:38 crc kubenswrapper[5118]: I0223 10:15:38.005394 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-x8xz4_f6ab91fc-435e-4d88-8dfa-837000d8decd/manager/0.log" Feb 23 10:15:48 crc kubenswrapper[5118]: I0223 10:15:48.697400 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:15:48 crc kubenswrapper[5118]: E0223 10:15:48.698071 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:15:55 crc kubenswrapper[5118]: I0223 10:15:55.442071 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2rz7k_d3d6cc75-6216-4db9-8f11-6b5045548df1/control-plane-machine-set-operator/0.log" Feb 23 10:15:55 crc kubenswrapper[5118]: I0223 10:15:55.550303 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wtljz_509bc0a3-3bfa-4d55-b42d-3f584823ba57/machine-api-operator/0.log" Feb 23 10:15:55 crc kubenswrapper[5118]: I0223 10:15:55.551316 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wtljz_509bc0a3-3bfa-4d55-b42d-3f584823ba57/kube-rbac-proxy/0.log" Feb 23 10:16:01 crc kubenswrapper[5118]: I0223 10:16:01.698119 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:16:01 crc kubenswrapper[5118]: E0223 10:16:01.698847 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:16:08 crc kubenswrapper[5118]: I0223 10:16:08.319676 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-x2phk_8c7ed690-42a7-491b-ac42-b0e4afda4272/cert-manager-controller/0.log" Feb 23 10:16:08 crc kubenswrapper[5118]: I0223 10:16:08.534145 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-bzjnz_523774d0-ef24-403e-8a1a-1fad85d7985c/cert-manager-cainjector/0.log" Feb 23 10:16:08 crc kubenswrapper[5118]: I0223 10:16:08.586524 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-rkjcd_ef65b4bf-df47-4f17-aaf8-a2dfb6ba56ba/cert-manager-webhook/0.log" Feb 23 10:16:15 crc kubenswrapper[5118]: I0223 10:16:15.697968 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:16:15 crc kubenswrapper[5118]: E0223 10:16:15.698756 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:16:20 crc kubenswrapper[5118]: I0223 10:16:20.573808 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-ff9mh_0de2854b-06ec-4c6f-816e-f6aba8930366/nmstate-console-plugin/0.log" Feb 23 10:16:20 crc kubenswrapper[5118]: I0223 10:16:20.790271 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kfjtf_c824e133-db80-49dc-a6a9-e90850e89a2a/nmstate-handler/0.log" Feb 23 10:16:20 crc kubenswrapper[5118]: I0223 10:16:20.825413 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-xsnnb_2113d5b6-58a4-48b3-87ce-d61904a474f9/kube-rbac-proxy/0.log" Feb 23 10:16:20 crc kubenswrapper[5118]: I0223 10:16:20.954777 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-xsnnb_2113d5b6-58a4-48b3-87ce-d61904a474f9/nmstate-metrics/0.log" Feb 23 10:16:21 crc kubenswrapper[5118]: I0223 10:16:21.055693 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-gg2kz_9334f91a-86de-49e4-bc57-04b55a83f25d/nmstate-operator/0.log" Feb 23 10:16:21 crc kubenswrapper[5118]: I0223 10:16:21.182360 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-vvxz2_03b5790f-4c4a-44ae-8236-96ec6d31ab74/nmstate-webhook/0.log" Feb 23 10:16:26 crc kubenswrapper[5118]: I0223 10:16:26.697490 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:16:26 crc kubenswrapper[5118]: E0223 10:16:26.698322 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:16:36 crc kubenswrapper[5118]: I0223 10:16:36.029782 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5w9fv_de0aafc9-dc3e-4457-96c6-5ef2ba36efbc/prometheus-operator/0.log" Feb 23 10:16:36 crc kubenswrapper[5118]: I0223 10:16:36.189469 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq_17fb334f-b958-4fba-b22b-f18ba52f29ae/prometheus-operator-admission-webhook/0.log" Feb 23 10:16:36 crc kubenswrapper[5118]: I0223 10:16:36.280331 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj_92174a3a-511b-43fd-ac12-662d7d419640/prometheus-operator-admission-webhook/0.log" Feb 23 10:16:36 crc kubenswrapper[5118]: I0223 10:16:36.457035 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2wznd_1b496827-29a0-4c5b-9d77-764163a0ddf7/operator/0.log" Feb 23 10:16:36 crc kubenswrapper[5118]: I0223 10:16:36.505427 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dtfft_97e1bcf9-32e9-4db5-8b2f-2113aadd6a86/perses-operator/0.log" Feb 23 10:16:41 crc kubenswrapper[5118]: I0223 10:16:41.697718 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:16:41 crc kubenswrapper[5118]: E0223 10:16:41.698737 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:16:51 crc kubenswrapper[5118]: I0223 10:16:51.715436 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5fkd5_6355cba8-47c0-4813-b2a6-14b7ba533f5d/kube-rbac-proxy/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.053053 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-frr-files/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.078083 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5fkd5_6355cba8-47c0-4813-b2a6-14b7ba533f5d/controller/0.log" 
Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.291842 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-frr-files/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.329340 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-metrics/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.330401 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-reloader/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.339766 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-reloader/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.502618 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-frr-files/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.568373 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-metrics/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.583832 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-reloader/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.623107 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-metrics/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.713950 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-frr-files/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.767234 5118 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-reloader/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.814621 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/cp-metrics/0.log" Feb 23 10:16:52 crc kubenswrapper[5118]: I0223 10:16:52.866186 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/controller/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.039968 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/frr-metrics/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.085407 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/kube-rbac-proxy/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.166878 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/kube-rbac-proxy-frr/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.320504 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/reloader/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.408567 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-gtnbz_cafc617b-b973-45be-8f33-c7a6414d303a/frr-k8s-webhook-server/0.log" Feb 23 10:16:53 crc kubenswrapper[5118]: I0223 10:16:53.678241 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-566586f54b-bm2tq_1549afd4-1ff5-4412-92f1-10e57c8f3bc8/manager/0.log" Feb 23 10:16:54 crc kubenswrapper[5118]: I0223 10:16:54.144571 5118 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_speaker-75bvd_ad24076e-75f9-406f-8f31-11211974625d/kube-rbac-proxy/0.log" Feb 23 10:16:54 crc kubenswrapper[5118]: I0223 10:16:54.222318 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5d5d656c8d-m4blz_721f2089-bb03-426d-adfa-f16793595526/webhook-server/0.log" Feb 23 10:16:55 crc kubenswrapper[5118]: I0223 10:16:55.219574 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-75bvd_ad24076e-75f9-406f-8f31-11211974625d/speaker/0.log" Feb 23 10:16:56 crc kubenswrapper[5118]: I0223 10:16:56.415357 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5vhwv_4e891361-5f36-4ebe-a394-c553a139765a/frr/0.log" Feb 23 10:16:56 crc kubenswrapper[5118]: I0223 10:16:56.697628 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:16:56 crc kubenswrapper[5118]: E0223 10:16:56.698478 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:17:08 crc kubenswrapper[5118]: I0223 10:17:08.696890 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:17:08 crc kubenswrapper[5118]: E0223 10:17:08.697726 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.066315 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/util/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.261646 5118 scope.go:117] "RemoveContainer" containerID="c05fda2b3296a81e5b6f8afd6fec560a0597e4f949ba354e9d577e6774bd35e0" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.325413 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/util/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.329982 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/pull/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.367747 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/pull/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.613726 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/extract/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.620704 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/util/0.log" Feb 23 10:17:09 crc 
kubenswrapper[5118]: I0223 10:17:09.631962 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jgm78_9fc8d426-e713-4b39-b393-717378bc1b1b/pull/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.808238 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/util/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.953551 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/util/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.957651 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/pull/0.log" Feb 23 10:17:09 crc kubenswrapper[5118]: I0223 10:17:09.974473 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/pull/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.190022 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/extract/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.203566 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/util/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.254284 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08gcvv7_cd06a418-d41c-43a0-84fe-c05fb6e60b51/pull/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.368521 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/util/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.545003 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/pull/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.567900 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/util/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.584736 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/pull/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.761244 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/util/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.807705 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/extract/0.log" Feb 23 10:17:10 crc kubenswrapper[5118]: I0223 10:17:10.858935 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213tlcp4_2961e2e0-0d3d-4ab9-8916-c03b6ef25b22/pull/0.log" Feb 23 
10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.138553 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-utilities/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.314227 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-utilities/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.342738 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-content/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.356384 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-content/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.502496 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-utilities/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.541285 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/extract-content/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.728747 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-utilities/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.925980 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-utilities/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.938459 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-content/0.log" Feb 23 10:17:11 crc kubenswrapper[5118]: I0223 10:17:11.982773 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-content/0.log" Feb 23 10:17:12 crc kubenswrapper[5118]: I0223 10:17:12.183846 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-content/0.log" Feb 23 10:17:12 crc kubenswrapper[5118]: I0223 10:17:12.244630 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/extract-utilities/0.log" Feb 23 10:17:12 crc kubenswrapper[5118]: I0223 10:17:12.555047 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/util/0.log" Feb 23 10:17:12 crc kubenswrapper[5118]: I0223 10:17:12.802945 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/util/0.log" Feb 23 10:17:12 crc kubenswrapper[5118]: I0223 10:17:12.868315 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/pull/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.091692 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/pull/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.243680 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-6wzcs_69456a54-9a29-4036-af66-f6c601f7c43d/registry-server/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.257709 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/util/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.334910 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/pull/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.376741 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca2dt6d_dd42c5ed-566b-45c4-b508-b14c7f0a2512/extract/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.464394 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kzwjr_2e64f70b-4830-4fef-94c0-d5eef18c5eb0/marketplace-operator/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.651624 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-utilities/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.884449 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-content/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.899121 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-utilities/0.log" Feb 23 10:17:13 crc kubenswrapper[5118]: I0223 10:17:13.915148 5118 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-content/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.130352 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-content/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.147541 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/extract-utilities/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.346983 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-utilities/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.548764 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-content/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.551065 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-utilities/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.647893 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-content/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.771526 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wgrql_70ba3929-8fce-46de-ac77-f0ac791d194c/registry-server/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.773140 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-content/0.log" Feb 
23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.779629 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/extract-utilities/0.log" Feb 23 10:17:14 crc kubenswrapper[5118]: I0223 10:17:14.795173 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-svbnv_38dd814b-2c41-4f46-a3c3-e54c5bcf2c43/registry-server/0.log" Feb 23 10:17:16 crc kubenswrapper[5118]: I0223 10:17:16.293919 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m8j96_051f6240-bd14-4b49-b892-318d70c8a38c/registry-server/0.log" Feb 23 10:17:21 crc kubenswrapper[5118]: I0223 10:17:21.697541 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:17:21 crc kubenswrapper[5118]: E0223 10:17:21.698548 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:17:28 crc kubenswrapper[5118]: I0223 10:17:28.669130 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5w9fv_de0aafc9-dc3e-4457-96c6-5ef2ba36efbc/prometheus-operator/0.log" Feb 23 10:17:28 crc kubenswrapper[5118]: I0223 10:17:28.675264 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d6d44b954-mkltj_92174a3a-511b-43fd-ac12-662d7d419640/prometheus-operator-admission-webhook/0.log" Feb 23 10:17:28 crc kubenswrapper[5118]: I0223 10:17:28.716295 5118 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d6d44b954-f7djq_17fb334f-b958-4fba-b22b-f18ba52f29ae/prometheus-operator-admission-webhook/0.log" Feb 23 10:17:28 crc kubenswrapper[5118]: I0223 10:17:28.903223 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-2wznd_1b496827-29a0-4c5b-9d77-764163a0ddf7/operator/0.log" Feb 23 10:17:28 crc kubenswrapper[5118]: I0223 10:17:28.917348 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dtfft_97e1bcf9-32e9-4db5-8b2f-2113aadd6a86/perses-operator/0.log" Feb 23 10:17:32 crc kubenswrapper[5118]: I0223 10:17:32.697657 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:17:32 crc kubenswrapper[5118]: E0223 10:17:32.698480 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:17:44 crc kubenswrapper[5118]: I0223 10:17:44.698204 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:17:44 crc kubenswrapper[5118]: E0223 10:17:44.700318 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" 
podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:17:46 crc kubenswrapper[5118]: E0223 10:17:46.894511 5118 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:47620->38.102.83.46:40611: write tcp 38.102.83.46:47620->38.102.83.46:40611: write: broken pipe Feb 23 10:17:57 crc kubenswrapper[5118]: I0223 10:17:57.192733 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:17:57 crc kubenswrapper[5118]: E0223 10:17:57.193742 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:18:08 crc kubenswrapper[5118]: I0223 10:18:08.698190 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:18:08 crc kubenswrapper[5118]: E0223 10:18:08.699414 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:18:23 crc kubenswrapper[5118]: I0223 10:18:23.697780 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:18:23 crc kubenswrapper[5118]: E0223 10:18:23.698825 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:18:38 crc kubenswrapper[5118]: I0223 10:18:38.698060 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:18:38 crc kubenswrapper[5118]: E0223 10:18:38.698949 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:18:51 crc kubenswrapper[5118]: I0223 10:18:51.703698 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:18:51 crc kubenswrapper[5118]: E0223 10:18:51.704486 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:19:03 crc kubenswrapper[5118]: I0223 10:19:03.707135 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:19:03 crc kubenswrapper[5118]: E0223 10:19:03.708182 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:19:19 crc kubenswrapper[5118]: I0223 10:19:19.698014 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:19:19 crc kubenswrapper[5118]: E0223 10:19:19.699297 5118 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlxj9_openshift-machine-config-operator(d3ecfa2c-410e-49e5-86ab-f386efab9cf6)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" Feb 23 10:19:33 crc kubenswrapper[5118]: I0223 10:19:33.698072 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:19:34 crc kubenswrapper[5118]: I0223 10:19:34.362467 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"bd2f9ceee3a1a561ea92f797bbc4e9977fe33ba272a549d02a6fcf71327faa28"} Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.072463 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pfh7v"] Feb 23 10:21:13 crc kubenswrapper[5118]: E0223 10:21:13.073364 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" containerName="collect-profiles" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.073376 5118 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" containerName="collect-profiles" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.073614 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9fa3d6-d150-4ef5-bf2b-686376a6bfc2" containerName="collect-profiles" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.075588 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.095841 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfh7v"] Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.163698 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhk57\" (UniqueName: \"kubernetes.io/projected/5323fe62-9141-4e33-83bd-8d0c914e8070-kube-api-access-vhk57\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.163773 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-catalog-content\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.163826 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-utilities\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 
10:21:13.266051 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhk57\" (UniqueName: \"kubernetes.io/projected/5323fe62-9141-4e33-83bd-8d0c914e8070-kube-api-access-vhk57\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.266586 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-catalog-content\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.266709 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-utilities\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.267289 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-catalog-content\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.267369 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5323fe62-9141-4e33-83bd-8d0c914e8070-utilities\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.292187 5118 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhk57\" (UniqueName: \"kubernetes.io/projected/5323fe62-9141-4e33-83bd-8d0c914e8070-kube-api-access-vhk57\") pod \"certified-operators-pfh7v\" (UID: \"5323fe62-9141-4e33-83bd-8d0c914e8070\") " pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:13 crc kubenswrapper[5118]: I0223 10:21:13.399669 5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.022037 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfh7v"] Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.596783 5118 generic.go:334] "Generic (PLEG): container finished" podID="5323fe62-9141-4e33-83bd-8d0c914e8070" containerID="086812ac492c8159e8bad6adc14aadf5a72c3b586d668a688b43090e50430b30" exitCode=0 Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.596880 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfh7v" event={"ID":"5323fe62-9141-4e33-83bd-8d0c914e8070","Type":"ContainerDied","Data":"086812ac492c8159e8bad6adc14aadf5a72c3b586d668a688b43090e50430b30"} Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.597063 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfh7v" event={"ID":"5323fe62-9141-4e33-83bd-8d0c914e8070","Type":"ContainerStarted","Data":"28dcea1b474fd8feecea1249a18f1a7419a5f7f039f83f67fad7dce2371f8165"} Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.599171 5118 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.902769 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.906535 
5118 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:14 crc kubenswrapper[5118]: I0223 10:21:14.914532 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.011684 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.012203 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.012628 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbrwd\" (UniqueName: \"kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.119087 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.119536 
5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbrwd\" (UniqueName: \"kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.119639 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.119649 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.120150 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.144324 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbrwd\" (UniqueName: \"kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd\") pod \"community-operators-r9bp2\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.238583 5118 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:15 crc kubenswrapper[5118]: I0223 10:21:15.842730 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:15 crc kubenswrapper[5118]: W0223 10:21:15.852047 5118 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeede929_b70c_4cb2_a803_b3052ce99677.slice/crio-3f59ef528c9ca3573fdb8c520da2b5dc8e97bb4b95b73ea941f97782b29bee84 WatchSource:0}: Error finding container 3f59ef528c9ca3573fdb8c520da2b5dc8e97bb4b95b73ea941f97782b29bee84: Status 404 returned error can't find the container with id 3f59ef528c9ca3573fdb8c520da2b5dc8e97bb4b95b73ea941f97782b29bee84 Feb 23 10:21:16 crc kubenswrapper[5118]: I0223 10:21:16.624720 5118 generic.go:334] "Generic (PLEG): container finished" podID="eeede929-b70c-4cb2-a803-b3052ce99677" containerID="39f25479674ccd84d40a5bae37fa30d52182d9ca987030f9c2911f0e6659791d" exitCode=0 Feb 23 10:21:16 crc kubenswrapper[5118]: I0223 10:21:16.625120 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerDied","Data":"39f25479674ccd84d40a5bae37fa30d52182d9ca987030f9c2911f0e6659791d"} Feb 23 10:21:16 crc kubenswrapper[5118]: I0223 10:21:16.625148 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerStarted","Data":"3f59ef528c9ca3573fdb8c520da2b5dc8e97bb4b95b73ea941f97782b29bee84"} Feb 23 10:21:18 crc kubenswrapper[5118]: I0223 10:21:18.666230 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" 
event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerStarted","Data":"d9efa930674c75b2278638668a8ce16ca3f8256b665c6690cb66ccba4447c010"} Feb 23 10:21:20 crc kubenswrapper[5118]: I0223 10:21:20.685759 5118 generic.go:334] "Generic (PLEG): container finished" podID="eeede929-b70c-4cb2-a803-b3052ce99677" containerID="d9efa930674c75b2278638668a8ce16ca3f8256b665c6690cb66ccba4447c010" exitCode=0 Feb 23 10:21:20 crc kubenswrapper[5118]: I0223 10:21:20.685841 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerDied","Data":"d9efa930674c75b2278638668a8ce16ca3f8256b665c6690cb66ccba4447c010"} Feb 23 10:21:20 crc kubenswrapper[5118]: I0223 10:21:20.688376 5118 generic.go:334] "Generic (PLEG): container finished" podID="5323fe62-9141-4e33-83bd-8d0c914e8070" containerID="b84fe08b680712a3d07d1229336460a3d99d137190f3483efe6530ebec17f9f2" exitCode=0 Feb 23 10:21:20 crc kubenswrapper[5118]: I0223 10:21:20.688407 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfh7v" event={"ID":"5323fe62-9141-4e33-83bd-8d0c914e8070","Type":"ContainerDied","Data":"b84fe08b680712a3d07d1229336460a3d99d137190f3483efe6530ebec17f9f2"} Feb 23 10:21:21 crc kubenswrapper[5118]: I0223 10:21:21.712599 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerStarted","Data":"1c10ce93868b3e606fad6fa032d2af33e2fb7ff843fc995e896061f4f00bc5d9"} Feb 23 10:21:21 crc kubenswrapper[5118]: I0223 10:21:21.712957 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfh7v" event={"ID":"5323fe62-9141-4e33-83bd-8d0c914e8070","Type":"ContainerStarted","Data":"6dfbd6c0bb299f59f7780d5a04e3eda8e115d35f044dd1a0436a6682edb287ee"} Feb 23 10:21:21 crc kubenswrapper[5118]: 
I0223 10:21:21.729063 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pfh7v" podStartSLOduration=2.00992258 podStartE2EDuration="8.729043199s" podCreationTimestamp="2026-02-23 10:21:13 +0000 UTC" firstStartedPulling="2026-02-23 10:21:14.598557318 +0000 UTC m=+12937.602341901" lastFinishedPulling="2026-02-23 10:21:21.317677917 +0000 UTC m=+12944.321462520" observedRunningTime="2026-02-23 10:21:21.727733287 +0000 UTC m=+12944.731517890" watchObservedRunningTime="2026-02-23 10:21:21.729043199 +0000 UTC m=+12944.732827772" Feb 23 10:21:21 crc kubenswrapper[5118]: I0223 10:21:21.770064 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9bp2" podStartSLOduration=3.124861069 podStartE2EDuration="7.770035742s" podCreationTimestamp="2026-02-23 10:21:14 +0000 UTC" firstStartedPulling="2026-02-23 10:21:16.626882477 +0000 UTC m=+12939.630667050" lastFinishedPulling="2026-02-23 10:21:21.27205715 +0000 UTC m=+12944.275841723" observedRunningTime="2026-02-23 10:21:21.747352193 +0000 UTC m=+12944.751136776" watchObservedRunningTime="2026-02-23 10:21:21.770035742 +0000 UTC m=+12944.773820325" Feb 23 10:21:23 crc kubenswrapper[5118]: I0223 10:21:23.400638 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:23 crc kubenswrapper[5118]: I0223 10:21:23.400966 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:24 crc kubenswrapper[5118]: I0223 10:21:24.449723 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pfh7v" podUID="5323fe62-9141-4e33-83bd-8d0c914e8070" containerName="registry-server" probeResult="failure" output=< Feb 23 10:21:24 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 
10:21:24 crc kubenswrapper[5118]: > Feb 23 10:21:25 crc kubenswrapper[5118]: I0223 10:21:25.239002 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:25 crc kubenswrapper[5118]: I0223 10:21:25.239066 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:26 crc kubenswrapper[5118]: I0223 10:21:26.298607 5118 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-r9bp2" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="registry-server" probeResult="failure" output=< Feb 23 10:21:26 crc kubenswrapper[5118]: timeout: failed to connect service ":50051" within 1s Feb 23 10:21:26 crc kubenswrapper[5118]: > Feb 23 10:21:30 crc kubenswrapper[5118]: I0223 10:21:30.790628 5118 generic.go:334] "Generic (PLEG): container finished" podID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerID="b67fa2bd0975bc3ad32231ffc41e99039dd9a8a3ccaf41df33670791e05eff66" exitCode=0 Feb 23 10:21:30 crc kubenswrapper[5118]: I0223 10:21:30.790697 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lrbmh/must-gather-l59jv" event={"ID":"ea04ed6c-6822-4375-809a-d0c8ab5f484b","Type":"ContainerDied","Data":"b67fa2bd0975bc3ad32231ffc41e99039dd9a8a3ccaf41df33670791e05eff66"} Feb 23 10:21:30 crc kubenswrapper[5118]: I0223 10:21:30.791761 5118 scope.go:117] "RemoveContainer" containerID="b67fa2bd0975bc3ad32231ffc41e99039dd9a8a3ccaf41df33670791e05eff66" Feb 23 10:21:31 crc kubenswrapper[5118]: I0223 10:21:31.045128 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lrbmh_must-gather-l59jv_ea04ed6c-6822-4375-809a-d0c8ab5f484b/gather/0.log" Feb 23 10:21:33 crc kubenswrapper[5118]: I0223 10:21:33.454146 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:33 crc kubenswrapper[5118]: I0223 10:21:33.533897 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pfh7v" Feb 23 10:21:33 crc kubenswrapper[5118]: I0223 10:21:33.615448 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfh7v"] Feb 23 10:21:33 crc kubenswrapper[5118]: I0223 10:21:33.712642 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"] Feb 23 10:21:33 crc kubenswrapper[5118]: I0223 10:21:33.712865 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6wzcs" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="registry-server" containerID="cri-o://0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f" gracePeriod=2 Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.349026 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.407896 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content\") pod \"69456a54-9a29-4036-af66-f6c601f7c43d\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.407970 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r8h6\" (UniqueName: \"kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6\") pod \"69456a54-9a29-4036-af66-f6c601f7c43d\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.408136 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities\") pod \"69456a54-9a29-4036-af66-f6c601f7c43d\" (UID: \"69456a54-9a29-4036-af66-f6c601f7c43d\") " Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.409382 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities" (OuterVolumeSpecName: "utilities") pod "69456a54-9a29-4036-af66-f6c601f7c43d" (UID: "69456a54-9a29-4036-af66-f6c601f7c43d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.420683 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6" (OuterVolumeSpecName: "kube-api-access-4r8h6") pod "69456a54-9a29-4036-af66-f6c601f7c43d" (UID: "69456a54-9a29-4036-af66-f6c601f7c43d"). InnerVolumeSpecName "kube-api-access-4r8h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.470145 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69456a54-9a29-4036-af66-f6c601f7c43d" (UID: "69456a54-9a29-4036-af66-f6c601f7c43d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.510885 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.510923 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r8h6\" (UniqueName: \"kubernetes.io/projected/69456a54-9a29-4036-af66-f6c601f7c43d-kube-api-access-4r8h6\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.510944 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69456a54-9a29-4036-af66-f6c601f7c43d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.834990 5118 generic.go:334] "Generic (PLEG): container finished" podID="69456a54-9a29-4036-af66-f6c601f7c43d" containerID="0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f" exitCode=0 Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.835144 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerDied","Data":"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f"} Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.835276 5118 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-6wzcs" event={"ID":"69456a54-9a29-4036-af66-f6c601f7c43d","Type":"ContainerDied","Data":"981e12ecbb9b64676ae14542cf4b8c8a2983b988e2b2d765325defe12b8854b8"} Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.835340 5118 scope.go:117] "RemoveContainer" containerID="0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.835220 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wzcs" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.865283 5118 scope.go:117] "RemoveContainer" containerID="a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.891896 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"] Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.892060 5118 scope.go:117] "RemoveContainer" containerID="a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.914381 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6wzcs"] Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.934685 5118 scope.go:117] "RemoveContainer" containerID="0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f" Feb 23 10:21:34 crc kubenswrapper[5118]: E0223 10:21:34.935457 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f\": container with ID starting with 0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f not found: ID does not exist" containerID="0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 
10:21:34.935498 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f"} err="failed to get container status \"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f\": rpc error: code = NotFound desc = could not find container \"0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f\": container with ID starting with 0927fcc6fb8368c064cdf15db648f41fc733793b05f6a9c0b45a13df53119f8f not found: ID does not exist" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.935551 5118 scope.go:117] "RemoveContainer" containerID="a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264" Feb 23 10:21:34 crc kubenswrapper[5118]: E0223 10:21:34.935900 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264\": container with ID starting with a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264 not found: ID does not exist" containerID="a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.935922 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264"} err="failed to get container status \"a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264\": rpc error: code = NotFound desc = could not find container \"a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264\": container with ID starting with a58b41074bc31b9bbdcd0a967f49c83e14770fcf30779f747d4bb09c65dc8264 not found: ID does not exist" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.935934 5118 scope.go:117] "RemoveContainer" containerID="a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7" Feb 23 10:21:34 crc 
kubenswrapper[5118]: E0223 10:21:34.936262 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7\": container with ID starting with a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7 not found: ID does not exist" containerID="a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7" Feb 23 10:21:34 crc kubenswrapper[5118]: I0223 10:21:34.936291 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7"} err="failed to get container status \"a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7\": rpc error: code = NotFound desc = could not find container \"a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7\": container with ID starting with a062eda4f4938dc915a27822251a22ccd8c5be01b858cf7da8e9cced7165e0b7 not found: ID does not exist" Feb 23 10:21:35 crc kubenswrapper[5118]: I0223 10:21:35.294939 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:35 crc kubenswrapper[5118]: I0223 10:21:35.353949 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:35 crc kubenswrapper[5118]: I0223 10:21:35.708999 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" path="/var/lib/kubelet/pods/69456a54-9a29-4036-af66-f6c601f7c43d/volumes" Feb 23 10:21:37 crc kubenswrapper[5118]: I0223 10:21:37.721643 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:37 crc kubenswrapper[5118]: I0223 10:21:37.722379 5118 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-r9bp2" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="registry-server" containerID="cri-o://1c10ce93868b3e606fad6fa032d2af33e2fb7ff843fc995e896061f4f00bc5d9" gracePeriod=2 Feb 23 10:21:37 crc kubenswrapper[5118]: I0223 10:21:37.887039 5118 generic.go:334] "Generic (PLEG): container finished" podID="eeede929-b70c-4cb2-a803-b3052ce99677" containerID="1c10ce93868b3e606fad6fa032d2af33e2fb7ff843fc995e896061f4f00bc5d9" exitCode=0 Feb 23 10:21:37 crc kubenswrapper[5118]: I0223 10:21:37.887111 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerDied","Data":"1c10ce93868b3e606fad6fa032d2af33e2fb7ff843fc995e896061f4f00bc5d9"} Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.262518 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.389388 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content\") pod \"eeede929-b70c-4cb2-a803-b3052ce99677\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.389441 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbrwd\" (UniqueName: \"kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd\") pod \"eeede929-b70c-4cb2-a803-b3052ce99677\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.389460 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities\") pod 
\"eeede929-b70c-4cb2-a803-b3052ce99677\" (UID: \"eeede929-b70c-4cb2-a803-b3052ce99677\") " Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.390862 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities" (OuterVolumeSpecName: "utilities") pod "eeede929-b70c-4cb2-a803-b3052ce99677" (UID: "eeede929-b70c-4cb2-a803-b3052ce99677"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.396262 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd" (OuterVolumeSpecName: "kube-api-access-nbrwd") pod "eeede929-b70c-4cb2-a803-b3052ce99677" (UID: "eeede929-b70c-4cb2-a803-b3052ce99677"). InnerVolumeSpecName "kube-api-access-nbrwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.464870 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeede929-b70c-4cb2-a803-b3052ce99677" (UID: "eeede929-b70c-4cb2-a803-b3052ce99677"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.491815 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.491845 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbrwd\" (UniqueName: \"kubernetes.io/projected/eeede929-b70c-4cb2-a803-b3052ce99677-kube-api-access-nbrwd\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.491857 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeede929-b70c-4cb2-a803-b3052ce99677-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.902573 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9bp2" event={"ID":"eeede929-b70c-4cb2-a803-b3052ce99677","Type":"ContainerDied","Data":"3f59ef528c9ca3573fdb8c520da2b5dc8e97bb4b95b73ea941f97782b29bee84"} Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.902615 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9bp2" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.902645 5118 scope.go:117] "RemoveContainer" containerID="1c10ce93868b3e606fad6fa032d2af33e2fb7ff843fc995e896061f4f00bc5d9" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.944884 5118 scope.go:117] "RemoveContainer" containerID="d9efa930674c75b2278638668a8ce16ca3f8256b665c6690cb66ccba4447c010" Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.949928 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.959617 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9bp2"] Feb 23 10:21:38 crc kubenswrapper[5118]: I0223 10:21:38.965407 5118 scope.go:117] "RemoveContainer" containerID="39f25479674ccd84d40a5bae37fa30d52182d9ca987030f9c2911f0e6659791d" Feb 23 10:21:39 crc kubenswrapper[5118]: I0223 10:21:39.712638 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" path="/var/lib/kubelet/pods/eeede929-b70c-4cb2-a803-b3052ce99677/volumes" Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.583382 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lrbmh/must-gather-l59jv"] Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.584232 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lrbmh/must-gather-l59jv" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="copy" containerID="cri-o://67aa9354bf4b1b213ddcae4c5c3d11584acce1e0449a49a8affac0253899a251" gracePeriod=2 Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.593576 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lrbmh/must-gather-l59jv"] Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.989504 5118 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-lrbmh_must-gather-l59jv_ea04ed6c-6822-4375-809a-d0c8ab5f484b/copy/0.log" Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.990421 5118 generic.go:334] "Generic (PLEG): container finished" podID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerID="67aa9354bf4b1b213ddcae4c5c3d11584acce1e0449a49a8affac0253899a251" exitCode=143 Feb 23 10:21:46 crc kubenswrapper[5118]: I0223 10:21:46.990477 5118 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c099812b1c6d6637812d966e7042a93ec7b73b30ef61cc871901aaf9088c1aa" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.040596 5118 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lrbmh_must-gather-l59jv_ea04ed6c-6822-4375-809a-d0c8ab5f484b/copy/0.log" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.041087 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lrbmh/must-gather-l59jv" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.192582 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output\") pod \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.192776 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gwl\" (UniqueName: \"kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl\") pod \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\" (UID: \"ea04ed6c-6822-4375-809a-d0c8ab5f484b\") " Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.204659 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl" 
(OuterVolumeSpecName: "kube-api-access-28gwl") pod "ea04ed6c-6822-4375-809a-d0c8ab5f484b" (UID: "ea04ed6c-6822-4375-809a-d0c8ab5f484b"). InnerVolumeSpecName "kube-api-access-28gwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.295221 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gwl\" (UniqueName: \"kubernetes.io/projected/ea04ed6c-6822-4375-809a-d0c8ab5f484b-kube-api-access-28gwl\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.442320 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ea04ed6c-6822-4375-809a-d0c8ab5f484b" (UID: "ea04ed6c-6822-4375-809a-d0c8ab5f484b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.541308 5118 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ea04ed6c-6822-4375-809a-d0c8ab5f484b-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.709630 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" path="/var/lib/kubelet/pods/ea04ed6c-6822-4375-809a-d0c8ab5f484b/volumes" Feb 23 10:21:47 crc kubenswrapper[5118]: I0223 10:21:47.998760 5118 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lrbmh/must-gather-l59jv" Feb 23 10:22:02 crc kubenswrapper[5118]: I0223 10:22:02.975552 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:22:02 crc kubenswrapper[5118]: I0223 10:22:02.975936 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:22:09 crc kubenswrapper[5118]: I0223 10:22:09.429355 5118 scope.go:117] "RemoveContainer" containerID="67aa9354bf4b1b213ddcae4c5c3d11584acce1e0449a49a8affac0253899a251" Feb 23 10:22:09 crc kubenswrapper[5118]: I0223 10:22:09.455691 5118 scope.go:117] "RemoveContainer" containerID="b67fa2bd0975bc3ad32231ffc41e99039dd9a8a3ccaf41df33670791e05eff66" Feb 23 10:22:32 crc kubenswrapper[5118]: I0223 10:22:32.975012 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:22:32 crc kubenswrapper[5118]: I0223 10:22:32.975559 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:23:02 crc kubenswrapper[5118]: I0223 10:23:02.975308 5118 
patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:23:02 crc kubenswrapper[5118]: I0223 10:23:02.975997 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:23:02 crc kubenswrapper[5118]: I0223 10:23:02.976080 5118 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" Feb 23 10:23:02 crc kubenswrapper[5118]: I0223 10:23:02.977485 5118 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd2f9ceee3a1a561ea92f797bbc4e9977fe33ba272a549d02a6fcf71327faa28"} pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 10:23:02 crc kubenswrapper[5118]: I0223 10:23:02.977629 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" containerID="cri-o://bd2f9ceee3a1a561ea92f797bbc4e9977fe33ba272a549d02a6fcf71327faa28" gracePeriod=600 Feb 23 10:23:03 crc kubenswrapper[5118]: I0223 10:23:03.904728 5118 generic.go:334] "Generic (PLEG): container finished" podID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerID="bd2f9ceee3a1a561ea92f797bbc4e9977fe33ba272a549d02a6fcf71327faa28" exitCode=0 Feb 23 10:23:03 crc 
kubenswrapper[5118]: I0223 10:23:03.904821 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerDied","Data":"bd2f9ceee3a1a561ea92f797bbc4e9977fe33ba272a549d02a6fcf71327faa28"} Feb 23 10:23:03 crc kubenswrapper[5118]: I0223 10:23:03.905241 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" event={"ID":"d3ecfa2c-410e-49e5-86ab-f386efab9cf6","Type":"ContainerStarted","Data":"72f8ece3aaf12306a20b699953ac52f6a7a705b3adf052c0a24109abf4d9403f"} Feb 23 10:23:03 crc kubenswrapper[5118]: I0223 10:23:03.905260 5118 scope.go:117] "RemoveContainer" containerID="dbb4da27c0070d315eb57b0ae789e5547b805805e93cae32fb2322642035c016" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.557677 5118 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558682 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="extract-utilities" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558695 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="extract-utilities" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558702 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="extract-content" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558708 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="extract-content" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558721 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="gather" Feb 
23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558727 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="gather" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558741 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="extract-content" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558746 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="extract-content" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558757 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="copy" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558763 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="copy" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558781 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558789 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558808 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.558816 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: E0223 10:23:50.558833 5118 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="extract-utilities" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 
10:23:50.558840 5118 state_mem.go:107] "Deleted CPUSet assignment" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="extract-utilities" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.559095 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeede929-b70c-4cb2-a803-b3052ce99677" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.559136 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="69456a54-9a29-4036-af66-f6c601f7c43d" containerName="registry-server" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.559150 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="gather" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.559167 5118 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea04ed6c-6822-4375-809a-d0c8ab5f484b" containerName="copy" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.560964 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.573217 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.691797 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.691924 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.691958 5118 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9pr8\" (UniqueName: \"kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.793463 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.793932 5118 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.793981 5118 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9pr8\" (UniqueName: \"kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.794053 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.794494 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.814001 5118 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9pr8\" (UniqueName: \"kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8\") pod \"redhat-marketplace-4ckkl\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:50 crc kubenswrapper[5118]: I0223 10:23:50.899826 5118 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:23:51 crc kubenswrapper[5118]: I0223 10:23:51.396690 5118 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:23:51 crc kubenswrapper[5118]: I0223 10:23:51.473948 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerStarted","Data":"cb4002eeb3f260dab0023af3fe29a90482bc4bf63747d4a4a08b4fd7c6621159"} Feb 23 10:23:52 crc kubenswrapper[5118]: I0223 10:23:52.488697 5118 generic.go:334] "Generic (PLEG): container finished" podID="4809ac39-b9c9-46da-aa07-b3d3eb9fded8" containerID="498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20" exitCode=0 Feb 23 10:23:52 crc kubenswrapper[5118]: I0223 10:23:52.488761 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerDied","Data":"498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20"} Feb 23 10:23:53 crc kubenswrapper[5118]: I0223 10:23:53.503693 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerStarted","Data":"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae"} Feb 23 10:23:54 crc kubenswrapper[5118]: I0223 10:23:54.515193 5118 generic.go:334] "Generic (PLEG): container finished" podID="4809ac39-b9c9-46da-aa07-b3d3eb9fded8" containerID="2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae" exitCode=0 Feb 23 10:23:54 crc kubenswrapper[5118]: I0223 10:23:54.515541 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" 
event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerDied","Data":"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae"} Feb 23 10:23:55 crc kubenswrapper[5118]: I0223 10:23:55.528559 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerStarted","Data":"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81"} Feb 23 10:23:55 crc kubenswrapper[5118]: I0223 10:23:55.563167 5118 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ckkl" podStartSLOduration=3.088712318 podStartE2EDuration="5.563149251s" podCreationTimestamp="2026-02-23 10:23:50 +0000 UTC" firstStartedPulling="2026-02-23 10:23:52.492204368 +0000 UTC m=+13095.495988951" lastFinishedPulling="2026-02-23 10:23:54.966641311 +0000 UTC m=+13097.970425884" observedRunningTime="2026-02-23 10:23:55.55775505 +0000 UTC m=+13098.561539623" watchObservedRunningTime="2026-02-23 10:23:55.563149251 +0000 UTC m=+13098.566933834" Feb 23 10:24:00 crc kubenswrapper[5118]: I0223 10:24:00.899902 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:00 crc kubenswrapper[5118]: I0223 10:24:00.900478 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:00 crc kubenswrapper[5118]: I0223 10:24:00.949460 5118 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:01 crc kubenswrapper[5118]: I0223 10:24:01.641002 5118 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:01 crc kubenswrapper[5118]: I0223 10:24:01.689354 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:24:03 crc kubenswrapper[5118]: I0223 10:24:03.613075 5118 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ckkl" podUID="4809ac39-b9c9-46da-aa07-b3d3eb9fded8" containerName="registry-server" containerID="cri-o://c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81" gracePeriod=2 Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.156768 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.252446 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities\") pod \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.252525 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9pr8\" (UniqueName: \"kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8\") pod \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.252584 5118 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content\") pod \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\" (UID: \"4809ac39-b9c9-46da-aa07-b3d3eb9fded8\") " Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.254083 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities" (OuterVolumeSpecName: "utilities") pod "4809ac39-b9c9-46da-aa07-b3d3eb9fded8" (UID: 
"4809ac39-b9c9-46da-aa07-b3d3eb9fded8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.258052 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8" (OuterVolumeSpecName: "kube-api-access-g9pr8") pod "4809ac39-b9c9-46da-aa07-b3d3eb9fded8" (UID: "4809ac39-b9c9-46da-aa07-b3d3eb9fded8"). InnerVolumeSpecName "kube-api-access-g9pr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.284065 5118 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4809ac39-b9c9-46da-aa07-b3d3eb9fded8" (UID: "4809ac39-b9c9-46da-aa07-b3d3eb9fded8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.356150 5118 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.356216 5118 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9pr8\" (UniqueName: \"kubernetes.io/projected/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-kube-api-access-g9pr8\") on node \"crc\" DevicePath \"\"" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.356236 5118 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4809ac39-b9c9-46da-aa07-b3d3eb9fded8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.623956 5118 generic.go:334] "Generic (PLEG): container finished" 
podID="4809ac39-b9c9-46da-aa07-b3d3eb9fded8" containerID="c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81" exitCode=0 Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.624042 5118 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ckkl" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.624048 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerDied","Data":"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81"} Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.624464 5118 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ckkl" event={"ID":"4809ac39-b9c9-46da-aa07-b3d3eb9fded8","Type":"ContainerDied","Data":"cb4002eeb3f260dab0023af3fe29a90482bc4bf63747d4a4a08b4fd7c6621159"} Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.624484 5118 scope.go:117] "RemoveContainer" containerID="c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.647612 5118 scope.go:117] "RemoveContainer" containerID="2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.675857 5118 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.682875 5118 scope.go:117] "RemoveContainer" containerID="498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.686751 5118 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ckkl"] Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.732583 5118 scope.go:117] "RemoveContainer" 
containerID="c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81" Feb 23 10:24:04 crc kubenswrapper[5118]: E0223 10:24:04.733033 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81\": container with ID starting with c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81 not found: ID does not exist" containerID="c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.733082 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81"} err="failed to get container status \"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81\": rpc error: code = NotFound desc = could not find container \"c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81\": container with ID starting with c91084678778468b643b3fe148459750ab5d3c79c3e58f5f31e37a9d9c1dfd81 not found: ID does not exist" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.733140 5118 scope.go:117] "RemoveContainer" containerID="2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae" Feb 23 10:24:04 crc kubenswrapper[5118]: E0223 10:24:04.733597 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae\": container with ID starting with 2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae not found: ID does not exist" containerID="2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.733672 5118 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae"} err="failed to get container status \"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae\": rpc error: code = NotFound desc = could not find container \"2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae\": container with ID starting with 2cdda91821b080081cc864c021c5d6905940a563e6907ef851749d3078dba7ae not found: ID does not exist" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.733722 5118 scope.go:117] "RemoveContainer" containerID="498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20" Feb 23 10:24:04 crc kubenswrapper[5118]: E0223 10:24:04.734016 5118 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20\": container with ID starting with 498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20 not found: ID does not exist" containerID="498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20" Feb 23 10:24:04 crc kubenswrapper[5118]: I0223 10:24:04.734064 5118 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20"} err="failed to get container status \"498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20\": rpc error: code = NotFound desc = could not find container \"498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20\": container with ID starting with 498cbeafae7491b2dce737ae83b79480d2a590a33ab74bc3d6acdfa3e3e85f20 not found: ID does not exist" Feb 23 10:24:05 crc kubenswrapper[5118]: I0223 10:24:05.708130 5118 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4809ac39-b9c9-46da-aa07-b3d3eb9fded8" path="/var/lib/kubelet/pods/4809ac39-b9c9-46da-aa07-b3d3eb9fded8/volumes" Feb 23 10:25:32 crc kubenswrapper[5118]: I0223 
10:25:32.975422 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:25:32 crc kubenswrapper[5118]: I0223 10:25:32.977813 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 10:26:02 crc kubenswrapper[5118]: I0223 10:26:02.974934 5118 patch_prober.go:28] interesting pod/machine-config-daemon-qlxj9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 10:26:02 crc kubenswrapper[5118]: I0223 10:26:02.975772 5118 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlxj9" podUID="d3ecfa2c-410e-49e5-86ab-f386efab9cf6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"